Channel: SQL Server Analysis Services forum

SQL Server 2014 Database Engine Tuning Advisor crashes.


We installed new instances of SQL Server 2014 and migrated databases from SQL Server 2008, and the Database Engine Tuning Advisor (DTA) started to crash at the "Generating Reports" step.

_________________________________________________________
Fault bucket , type 0

Event Name: APPCRASH
Response: Not available
Cab Id: 0

Problem signature:
P1: DTAEngine.exe
P2: 2014.120.2000.8
P3: 5306c816
P4: StackHash_0301
P5: 6.3.9600.17736
P6: 550f42c2
P7: c0000374
P8: PCH_3A_FROM_ntdll+0x0003CA2C
P9: 
P10: 

Attached files:
C:\Users\BSydor\AppData\Local\Temp\WERBFAE.tmp.WERInternalMetadata.xml
C:\Users\BSydor\AppData\Local\Temp\WER65D3.tmp.appcompat.txt
C:\Users\BSydor\AppData\Local\Microsoft\Windows\WER\ReportQueue\AppCrash_DTAEngine.exe_19fc8d241354263c66005d209bd71dc0cf64a_cdf0a64c_cab_14b867a6\memory.hdmp
C:\Users\BSydor\AppData\Local\Microsoft\Windows\WER\ReportQueue\AppCrash_DTAEngine.exe_19fc8d241354263c66005d209bd71dc0cf64a_cdf0a64c_cab_14b867a6\triagedump.dmp

These files may be available here:
C:\Users\BSydor\AppData\Local\Microsoft\Windows\WER\ReportQueue\AppCrash_DTAEngine.exe_19fc8d241354263c66005d209bd71dc0cf64a_cdf0a64c_cab_14b867a6

Analysis symbol: 
Rechecking for solution: 0
Report ID: f9424b21-2f92-11e5-82aa-0c8bfd9f322d
Report Status: 100
Hashed bucket: 


Uninstall Clustered Instance from Node


I've installed SSAS 2012 as a failover cluster instance on a 2-node Windows 2012 R2 cluster. All was fine in that regard, but I now need to change the instance name, so I'm going through the process of uninstalling and re-installing. I've successfully removed the second node from the configuration using the "Remove cluster node" wizard from the Setup options, but when it comes to removing the instance from the final node I get the error "the cluster group **** could not be moved from node x to node 'null'". At this point the uninstall fails and the cluster resource remains in Cluster Administrator. It has obviously done some work/damage, because the service won't start now, but the instance is left in limbo where I can't do anything with it.

Is there a manual way of removing the instance to overcome this issue (file deletions, registry entry deletions etc.)?  (usual caveats regarding registry modifications accepted!) :)

Many thanks

Phil

SSAS ROLAP Aggregation Processing Issue.


We planned to convert an existing cube (currently using MOLAP storage mode), so we started converting the existing measure group and partition properties to ROLAP storage mode.

Initially, the cube has MOLAP measure groups and aggregations. To convert the measure groups to ROLAP mode, we changed the measure group and partition storage mode to ROLAP.

After deploying and processing the cube, processing failed with an error stating that the aggregation schema binding was not done.

Only by deleting the aggregations were we able to process the respective measure groups.

Is there any way to keep the measure groups in ROLAP while still using the existing aggregations?

SSAS ROLAP Aggregation

 

DAX with IF Statement very slow


Hi,

I have been trying to optimize a few DAX calculations in my cube and ran into this problem.

Basically, my DAX measure

Counterparty Collateral Base DKK :=
[Counterparty Collateral Contracts DKK (filtered)]
CALCULATE(-[Account Balances DKK], Accounts[Counterpart Type ID] <> 8, Accounts[ContractTypeID] <> 14, Accounts[ContractTypeID] <> 16)

is pretty fast, whereas as soon as I change the second part to only add the account balance if it is > 0, it takes ages, i.e.

Counterparty Collateral Base DKK :=
[Counterparty Collateral Contracts DKK (filtered)]
IF(CALCULATE(-[Account Balances DKK], Accounts[Counterpart Type ID] <> 8, Accounts[ContractTypeID] <> 14, Accounts[ContractTypeID] <> 16) > 0,
   CALCULATE(-[Account Balances DKK], Accounts[Counterpart Type ID] <> 8, Accounts[ContractTypeID] <> 14, Accounts[ContractTypeID] <> 16),
   0)

Of course, I suppose that is because the IF condition is evaluated for every row. The question is: how do I improve it?

By the way: [Account Balances DKK] is in the Account Balances table, [Counterparty Collateral Contracts DKK (filtered)] is in the Open Positions table, and the calculated measure [Counterparty Collateral Base DKK] is also in Open Positions.

And the MDX query that I run to pull this measure looks like:

SELECT
  { [Measures].[Counterparty Collateral Base DKK] } ON 0,
  NON EMPTY
  { [Accounts].[Counterpart ID].Children
    * [Base Currencies].[Currency Code].Children
    * [Products].[Product Name].Children } ON 1
FROM [model]
WHERE [Dates].[Date].&[2015-06-30T00:00:00]

The requirement is such that I have to add the measures in two separate tables.

How can I improve the performance when adding the filter?

Any help would be appreciated.
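One possible restructuring, offered only as a sketch (the helper measure name below is made up, and the "+" joining the two parts is an assumption, since the operator between them is missing from the post): define the filtered balance once as its own measure and reference it from the IF. This keeps the CALCULATE filter logic in one place; it does not by itself guarantee that the engine evaluates the branch only once, so it is a starting point for profiling rather than a confirmed fix.

Counterparty Collateral Balance DKK :=
    // same filters as in the original measure
    CALCULATE(
        -[Account Balances DKK],
        Accounts[Counterpart Type ID] <> 8,
        Accounts[ContractTypeID] <> 14,
        Accounts[ContractTypeID] <> 16
    )

Counterparty Collateral Base DKK :=
    [Counterparty Collateral Contracts DKK (filtered)]
    + IF(
          [Counterparty Collateral Balance DKK] > 0,   // only add positive balances
          [Counterparty Collateral Balance DKK],
          0
      )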


                         

SSAS DRILLTHROUGH - Specified query is too complex error.


Hi Guys,

I'm working on an SSAS 2012 OLAP cube with around 100 dimensions and a single measure group. When I try to run a simple DRILLTHROUGH query such as:

DRILLTHROUGH MAXROWS 1000 SELECT FROM [Cube] WHERE ([Measures].[Fact_Count])

I get an error as below.

The specified query is too complex to be evaluated as a single statement.

Does anyone have experience of this error and how to work around it? If I remove the majority of the dimensions from the cube, I can run this query successfully. Looking at the documented limits for Analysis Services, I should be well within them.
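For context, one option worth trying, shown only as a sketch with hypothetical dimension, attribute, and measure group names (the real ones are not given in the post): DRILLTHROUGH accepts a RETURN clause that lists explicitly which columns to bring back, which can keep the statement from having to expand every dimension in the cube.

DRILLTHROUGH MAXROWS 1000
SELECT FROM [Cube] WHERE ([Measures].[Fact_Count])
RETURN
    NAME([$Dim Date].[Date]),           -- hypothetical dimension/attribute
    NAME([$Dim Product].[Product]),     -- hypothetical dimension/attribute
    [Fact Measure Group].[Fact_Count]   -- hypothetical measure group name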

Thanks in advance 

Sean



Looping AMO objects Vs Using SSAS Built-In Parallel Processing


I am creating an SSIS Script Task that will be used to process SSAS dimensions and partitions and ideally log the details of each in a table.

I was hoping someone could weigh in on the benefits or drawbacks of using the built-in SSAS parallel processing as opposed to doing it manually in a multi-threaded "Parallel.ForEach" loop using the .NET AMO library.

In my testing, when I use a Parallel.ForEach loop, I am able to obtain and log information about each object, such as end time and time to process, immediately after it is processed. This allows me to keep a history of processing time for each object.

I haven't found a way to get this same detailed progress information for each dimension or partition when sending a Parallel XMLA command.

If circumventing the built-in SSAS parallel processing is not best practice I'd like to know in advance before I go too far down that path.  Thanks in advance for any input!

Corrupted Aggregation File on ProcessIndex


I am trying to update my aggregation design for a partition using BIDS Helper. The current aggregation design contains about 60 aggregations, and the new aggregation I am trying to add is across 5 dimension attributes, the product of which is about 500,000 unique values. The fact table is about 13,000,000 rows.

When I deploy the aggregation and run ProcessIndex, I get the following error:

File system error: The following file is corrupted: Physical file: \\?\E:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\TestDB.14.db\cube_2.607.cub\cubemeasuregroup_2.633.det\DefaultPartition_2.579.prt\852.agg.flex.data. Logical file.

If I remove the new aggregation, deploy, and run ProcessIndex again, it processes fine.

Is there some file size limitation I am running into?  The agg.flex.data file is 7.8 GB before adding the new aggregation, so it isn't subject to the same 4 GB limit as .asstore.

Thanks,

Mitch

Windows Server 2008 64 bit

SQL Server 2008 R2 (10.50.1746.0) 64 bit

What are the size limits of file *.agg.flex.data?


What are the size limits of the *.agg.flex.data files? These files are typically located in the SSAS data directory.

While processing the cubes with "Process Index", I am getting the error message below:

File system error: The following file is corrupted: Physical file: \?\F:\OLAP\<DB_Name>.0.db\<Cube_Name>.0.cub\<Name>.0.det\<Partition_Name>.0.prt\33.agg.flex.data. Logical file.

However, if I navigate to the location mentioned in the error message, the specified file is not present at the given location.

I have looked at a similar question regarding a corrupted aggregation file on ProcessIndex (https://social.msdn.microsoft.com/Forums/sqlserver/en-US/2c1f677a-60c5-4a5f-99ec-daaffd564ecb/corrupted-aggregation-file-on-processindex), but I cannot implement the proposed solution because I don't have the associated fact tables (source data) needed to perform a "Process Data" operation.

I have checked the various agg.flex.data file sizes; they range from ~200 MB to ~5 GB. Is there any size-limit configuration associated with these files? If so, can we change it in SQL Server 2008 R2? If not, is there any update provided by Microsoft that takes care of such configurations?

Execution environment: Windows Server 2008 R2 Enterprise, SQL Server 2008 R2 + SP1, 64 GB RAM

If anyone has faced such an issue before, please help. Any help would be highly appreciated.


Masking dimension data in SSAS based on role


Hi..

The requirement is to mask dimension data as "N/A" for certain users in an SSAS cube, based on role. For example, I have Employee data with Employeedateofbirth, EmployeeName, and EmployeeLocation in the Employee dimension.

A normal user should see the following:

Employeedateofbirth     EmployeeName     EmployeeLocation
01/01/2015              Employee1        UK
02/02/2015              Employee2        USA

 

A restricted user should see results like the following in the cube, Excel, and SSRS, as per the role:

Employeedateofbirth     EmployeeName     EmployeeLocation
01/01/2015              N/A              N/A
02/02/2015              N/A              N/A

We have to achieve this in the cube, Excel, and SSRS reports. Please kindly help with this.

Thank you..

Excel with Xlcubed Freeze after Refresh of Grid - Audit Logout Event


Hi,

I have a very small cube, 5 dimensions and 1 fact with a few hundred records, which shows me the instance locks after processing and syncing. I also have a little grid made with XLCubed. Today I wanted to refresh the grid. The workbook had been open for some hours. After clicking refresh, Excel froze, and in Profiler I didn't see any activity. I opened a new Excel workbook, built the same grid, and got my result after one second. The original workbook was still frozen. After approximately 15 minutes the original finished. When I looked into Profiler, I saw that an Audit Logout event was fired and only after that was the query processed. I got no messages in Excel about closed connections or sessions.

Any ideas? Thanx in Advance

Jürgen

Binomial Distribution Function in MDX


Hello Experts,

I have a question about MDX. I want to calculate the binomial distribution for one of the measures in my cube. The requirement is to create a new calculated measure in the cube that computes the binomial distribution for a built-in measure.
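For reference, the binomial distribution here means the binomial probability mass function: whichever cube values end up supplying the number of trials n, the number of successes k, and the success probability p, the calculated measure ultimately has to express

P(X = k) = \binom{n}{k} \, p^{k} (1 - p)^{n - k}, \qquad \binom{n}{k} = \frac{n!}{k!\,(n - k)!}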

Please let me know how I can do this in the cube. I am using SSAS 2012 Standard Edition.

Thanks,

Ruchika

How to set Attributes in sort order for a dimension in SSAS cube?


Hi,

For example: I have one dimension named "Name", and under it there are "FirstName" and "LastName" attributes.

But when I drag the "Name" dimension, "First Name" is dragged in by default, and I want "Last Name" to be dragged in instead.

Hope you understand my question.

Can you please help with this? Thank you.

Nagaraju



MDX - Non Empty and Subset


Actually, I am trying to apply SUBSET to the data that gets returned from the NON EMPTY rows, but it's not coming out right. Is it possible to use SUBSET on a NON EMPTY set?

SELECT
NONEMPTY
{
      [Measures].[Cnt]
} ON 0,
NONEMPTY
(
      {Prov}
      ,{[Employee].[License Number].[License Number]}
      ,{[Employee].[First Name].[First Name]}
      ,{[Employee].[Last Name].[Last Name]}
      ,{[Employee].[Practice Name].[Practice Name]}
      ,{[Employee].[Address Line1].[Address Line1]}
      ,{[Employee].[City].[City]}
      ,{[Employee].[Zipcode].[Zipcode]}
      ,{[Employee].[State].[State]}
      ,{[Employee].[Phone].[Phone]}
      ,{[Employee].[Noof Locations].[Noof Locations]}
      ,{[Location].[State].&[MD]}
) ON 1
FROM Cube
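For what it's worth, SUBSET can be applied to the set that a NONEMPTY call returns, since NONEMPTY simply yields another set. The simplified sketch below uses only a couple of the attributes from the query above and an arbitrary start/count, so treat it as an illustration of the nesting rather than a drop-in replacement:

SELECT
      { [Measures].[Cnt] } ON 0,
      SUBSET(
            NONEMPTY(
                  [Employee].[License Number].[License Number].MEMBERS
                  * [Employee].[Last Name].[Last Name].MEMBERS,
                  [Measures].[Cnt]
            ),
            0, 100   -- first 100 non-empty rows
      ) ON 1
FROM Cube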

SQL 2012 BIDS/SSDT - Can I use VS 2012 instead of 2010?


I've got SQL Server 2012 Developer Edition installed locally on my workstation. Somehow SSDT got uninstalled when I loaded SSDT for Visual Studio 2012.

Does SSDT for SQL 2012 use the VS 2010 Shell?  Can I force it to use VS 2012?

SSAS Calculated Measures - Newbie Question


Team, 

I have the calculation below doing a comparison between the current and previous members of a fiscal date dimension.

The issue that I have is, when I pick 2014 and 2015, this shows the comparison between 2013/2014 and 2014/2015, while I expect only to see the comparison between the current members (2014/2015). 

What can I do in order to achieve the above?

([Ship Date - Fiscal].[Year - Fiscal].CurrentMember, [Measures].[Line Total - Unformatted])
- ([Ship Date - Fiscal].[Year - Fiscal].PrevMember, [Measures].[Line Total - Unformatted])


SSAS design tip to measure across date range

Developing a retail cube using SSAS 2012. One of the dimensions is DimCustomer with SCD Type II. Each customer can be a member or a non-member over a period of time. We have StartDt and EndDt to reflect the membership status.

e.g.:
Joe is a member between 06-01-2014 and 08-31-2014
Joe is a non-member between 09-01-2014 and 01-31-2015
Joe is a member between 02-01-2015 and 04-30-2015
Joe is a non-member between 05-01-2015 and 12-31-9999

Without adding a fact row for Joe for each day to reflect the membership status, I want to provide the ability to measure an "Active Customers Count" on a given date. There are 2 million customers in the DimCustomer table.
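One way the logic could be expressed without daily fact rows, shown only as a sketch: the dimension, attribute, property, and cube names below are hypothetical, and iterating member properties over 2 million customers will not perform well, so this mainly illustrates the calculation, while a period-grain bridge or factless measure group is the more scalable design.

WITH MEMBER [Measures].[Active Customers Count] AS
    COUNT(
        FILTER(
            [Customer].[Customer].[Customer].MEMBERS,
            -- StartDt / EndDt exposed as typed member properties (hypothetical names)
            [Customer].[Customer].CurrentMember.Properties("StartDt", TYPED)
                <= [Date].[Date].CurrentMember.MemberValue
            AND [Customer].[Customer].CurrentMember.Properties("EndDt", TYPED)
                >= [Date].[Date].CurrentMember.MemberValue
        )
    )
SELECT [Measures].[Active Customers Count] ON 0
FROM [Retail]
WHERE [Date].[Date].&[2015-02-15T00:00:00]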

Any design tip is greatly appreciated.

ravi

MDX Format(Now(),'yyyy-MM-ddTHH:mm:ss') is showing NULL


Member CurrentTime as SUM({LASTPERIODS(1,StrToMember('[Date].[Date].&[' + Format(Now(),'yyyy-MM-ddTHH:mm:ss') + ']'))},CoalesceEmpty([Measures].[Total Rows], 0))

 Member CurrentTimeFixed as SUM({LASTPERIODS(1,[Date].[Date].&[2015-07-21T00:00:00])},CoalesceEmpty([Measures].[Total Rows], 0))

I am trying to make my calculated members dynamic using Now(), but whenever I try it, it displays null values. However, when I hardcode the date, it gives me what I want. Any ideas on what might be the problem?
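One detail worth checking, offered as an observation rather than a confirmed diagnosis: Format(Now(), 'yyyy-MM-ddTHH:mm:ss') embeds the current clock time, while the member key that works in the hardcoded version ends in T00:00:00, so the built string may simply not match any existing [Date].[Date] member. A sketch of the same member with the key pinned to midnight:

Member CurrentTime as
    SUM(
        { LASTPERIODS(1,
            // format only the date part and append the midnight suffix used by the key
            StrToMember('[Date].[Date].&[' + Format(Now(), 'yyyy-MM-dd') + 'T00:00:00]')
        ) },
        CoalesceEmpty([Measures].[Total Rows], 0)
    )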

When building a new SSAS project, what is the best practice regarding underlying schema changes?


I seem to have a really hard time when making schema changes: adding a new fact table, renaming a dimension column, adding a new measure, or renaming an existing one. Somehow these things tend to cause problems at one or more stages, either running the schema wizard or processing the cube.

So I ask: what is your overall strategy for introducing schema updates into your database with regard to the SSAS project?

Another way to ask this is: how often do you find yourself deleting the DSV and/or the whole cube and starting over because some schema change led to a cascade of issues that just didn't seem to want to let you correct them?

Scoping role playing dimension


Hi,

I have a pretty simple question. Is it possible to scope a role-playing dimension so that only the dimension with a certain role is affected? An example would be to scope "Delivery Date.Members, Sales" without it having an impact on "Sales Date.Members, Sales".

Or do I really need to import the same table twice in my DSV?
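For illustration only, a minimal sketch of such a scope in the cube's MDX script, using hypothetical hierarchy and measure names; because each role-playing dimension is a separate cube dimension with its own name, a SCOPE written against [Delivery Date] should not touch [Sales Date]:

SCOPE([Delivery Date].[Date].MEMBERS, [Measures].[Sales]);
    // example assignment only: applies to the Delivery Date role,
    // while ([Sales Date].[Date], [Measures].[Sales]) is left alone
    THIS = [Measures].[Sales] * 1.1;
END SCOPE;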

Thanks.

SSAS 2008 A relation already exists for these child columns


Not sure what the culprit is for this error cropping up.

I've added a new measure group and I'm trying to generate the schema for the resulting fact table, but something isn't right. However, I don't think it's my new measure group that's causing the issue, because I removed it and I still get this error when I select Database > Generate Relational Schema.
