Channel: SQL Server Analysis Services forum

DAX SELECTEDVALUE with RLS


I have a scenario where some users need to be able to impersonate others: when they select the user they want to impersonate on a Power BI report, RLS should reflect that selection. I tried adding SELECTEDVALUE to the RLS filter, but I only get back the alternate result, e.g.:

SELECTEDVALUE ( 'Staff'[Staff Name], "Unknown" )

If I add this as a measure to my Power BI report I receive the expected result when filtering Staff Name on a slicer.

I've also tried the following:

IF (ISFILTERED ( 'Staff'[Staff Name] ), VALUES('Staff'[Staff Name] ), "Unknown")

IF(HASONEVALUE('Staff'[Staff Name] ), VALUES('Staff'[Staff Name] ), "Unknown")

Is there a limitation when using these functions with SSAS 2017 RLS?
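One point worth noting: RLS filter expressions are evaluated against the security context, not against the report's slicer selections, so slicer-driven functions like SELECTEDVALUE fall back to their alternate result inside a role filter. A common workaround is to drive the impersonation from USERNAME() via a mapping table. A minimal sketch of a role filter on 'Staff', assuming a hypothetical 'ImpersonationMap' table (Login, Target Staff Name) maintained for the users allowed to impersonate:

```dax
'Staff'[Staff Name]
    = LOOKUPVALUE (
        'ImpersonationMap'[Target Staff Name],
        'ImpersonationMap'[Login], USERNAME ()
    )
```

The impersonation choice then lives in the mapping table rather than in a slicer.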


Thanks



EXISTING not working as expected


Hello All,

  I am using the scope assignment below to sum the selected state values up to the All level.

SCOPE ( [Geography].[State].[All], [Measures].[Nd Num 1] );
    THIS = SUM ( EXISTING ( [Geography].[State].[State] ), [Measures].[Nd Num 1] );
END SCOPE;

Even if I filter to only one state, at the All level I get the sum of all the states instead of just the filtered state.

Please note that the aggregation type for [Nd Num 1] is Max; it is only across State that I am trying to sum it.

Can anyone please let me know what the issue is here?
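Not sure of the root cause, but one thing worth trying: EXISTING is a set operator, so applying it to the level's member set explicitly is the more usual form. A hedged variant of the assignment above, using the same measure and hierarchy names as in the question:

```mdx
SCOPE ( [Geography].[State].[All], [Measures].[Nd Num 1] );
    THIS = SUM (
        EXISTING [Geography].[State].[State].MEMBERS,
        [Measures].[Nd Num 1]
    );
END SCOPE;
```

EXISTING here restricts the State members to those that survive the current (e.g. slicer) context before the SUM runs.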

Regards,

Bharath

How to create multiple Measure Groups based on single fact table


Hi all,

Can anyone suggest how to create multiple measure groups based on a single fact table?

I've tried creating a named query, but it did not work.

Please suggest another approach.

Thanks and Regards,

Saidarao

SQL linked server to OLAP cube - converting MDX measures to float in SQL?


Good day,

I have an SSIS package that used to do a data conversion on an MDX query via a linked server, but now I am getting all kinds of metadata issues on the measures; it fails even though ValidateExternalMetadata = False.

I am trying to do the conversion in the SQL linked-server query, but somehow I can't reference the measure name:

SELECT * 
,cast([Measures].[Budget Volume] as float) as BudgetVolume
from OpenQuery(DW_AS_SALES,'SELECT NON EMPTY {[Measures].[Sales Volume Year Minus1]
				, [Measures].[Budget Volume] 
				, [Measures].[Sales Volume]
				, [Measures].[Sales Value Year Minus1]
				, [Measures].[Budget Value]
				, [Measures].[Sales Value Excl] } ON COLUMNS 
	    , NON EMPTY {([Company Dimension].[Company].[Company].ALLMEMBERS 
				  * [Sales Indicator].[Sales Indicator].[Sales Indicator].ALLMEMBERS 
				  * [Depot Sales Dimension].[Depot].[Depot].ALLMEMBERS 
				  * [Item Dimension].[Parmalat Main Category].[Parmalat Main Category].ALLMEMBERS 
	) } ON ROWS 

 FROM [SSAS DW Sales]
		WHERE (   [Date Dimension].[Date Year].&[2019]
		 , [Item Dimension].[Item Commercial Filter].&[Commercial Items] )')

Any help on converting the MDX measures in SQL would be appreciated.
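One approach worth trying: OPENQUERY returns the MDX columns under their full bracketed captions (e.g. [Measures].[Budget Volume]), so the outer SELECT has to treat that whole caption as a quoted identifier rather than as an MDX reference. A sketch, abbreviating the inner MDX from the query above (requires QUOTED_IDENTIFIER ON, which is the default):

```sql
SELECT CAST("[Measures].[Budget Volume]" AS float) AS BudgetVolume
FROM OPENQUERY(DW_AS_SALES,
    'SELECT NON EMPTY { [Measures].[Budget Volume] } ON COLUMNS
     FROM [SSAS DW Sales]');
```

The double quotes make T-SQL read the entire bracketed caption as one column name instead of trying to parse the MDX-style reference.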

Regards

Benefits of Partitioning a SSAS Tabular Cube


Hi All,

I have a few questions:

1) Are there any benefits to partitioning a tabular model other than processing performance?

2) Would there be any effect on query performance?

3) If yes, how does partition elimination work? Do we have to do it manually?

4) Is it possible to manually point a Tableau/SSRS report to select data from only a certain partition and not read the others?

Can someone answer these questions, or point me to a source for a quick read, so that I can get a better idea?

Thanks,

Jenni


Do you want to be acknowledged as a Microsoft SSAS Guru? Submit your work to the January 2020 competition!



What is TechNet Guru Competition?

Each month the TechNet Wiki council organizes a contest of the best articles posted that month. This is your chance to be announced as MICROSOFT TECHNOLOGY GURU OF THE MONTH!

One winner in each category will be selected each month for glory and adoration by the MSDN/TechNet Ninjas and the community as a whole. Winners will be announced in a dedicated post on the Microsoft Wiki Ninjas blog and in a tweet from the Wiki Ninjas Twitter account; links will be published to the Microsoft TNWiki group on Facebook, and other acknowledgement from the community will follow.

Some of our biggest community voices and many MVPs have passed through these halls on their way to fame and fortune.

If you have already made a contribution in the forums or gallery, or you have published a nice blog post, you can simply convert it into a shared wiki article, reference the original post, and register the article for the TechNet Guru Competition. The articles must be written in January 2020 and must be in English. However, the original blog or forum content can be from before January 2020.

Come and see who is making waves in all your favorite technologies. Maybe it will be you!


Who can join the Competition?

Anyone who has basic knowledge and the desire to share it is welcome. Articles can appeal to beginners or discuss advanced topics. All you have to do is add your article to TechNet Wiki under your own specialty category.


How can you win?

  1. Copy or write up your Microsoft technical solutions and revelations on TechNet Wiki.
  2. Add a link to your new article on THIS WIKI COMPETITION PAGE (so we know you've contributed).
  3. (Optional but recommended) Add a link to your article in the TechNet Wiki group on Facebook. The group is very active and people love to help; you can get feedback and even direct improvements to the article before the contest starts.

Do you have any question or want more information?

Feel free to ask any questions below, or join us at the official Microsoft TechNet Wiki groups on Facebook. Read more about the TechNet Guru Awards.

If you win, people will sing your praises online and your name will be raised as Guru of the Month.


PS: The top banner above came from Syed Shanu.

Thanks,
Kamlesh Kumar

If my reply is helpful, please mark it as Answer or vote it as Helpful.

My blog | Twitter | LinkedIn

SQL to DAX


Hi,

How do I convert this SQL query to DAX?

SELECT 
       f.[Calendarkey],
       f.[TotalSalesAmount]
  FROM [fact].[InternetSales] f
  left join  [Dim].[Calendar] D
  on f.[CalendarKey]=D.[CalendarKey]
  where  [OrderDate] >= DATEADD(YEAR, -2, DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()), 0)) 

I tried this, but there are errors:

EVALUATE
CALCULATETABLE (
    SUMMARIZECOLUMNS (
        'InternetSales'[CalendarKey],
        'InternetSales'[TotalSalesAmount],
        FILTER (
            ALLNOBLANKROW ( 'Calendar'[OrderDate] ),
            'Calendar'[OrderDate]
                >= DATEADD ( YEAR, -2, DATEADD ( DAY, DATEDIFF ( DAY, 0, TODAY () ), 0 ) )
        )
    )
)
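Part of the problem is that DAX's DATEADD and DATEDIFF are not the same functions as T-SQL's (DAX's DATEADD is a time-intelligence function over a date column, and DATEDIFF takes different arguments). A hedged sketch that computes the two-years-ago cutoff with DATE/TODAY instead, using the table and column names from the question:

```dax
EVALUATE
VAR Cutoff =
    DATE ( YEAR ( TODAY () ) - 2, MONTH ( TODAY () ), DAY ( TODAY () ) )
RETURN
    CALCULATETABLE (
        SELECTCOLUMNS (
            'InternetSales',
            "CalendarKey", 'InternetSales'[CalendarKey],
            "TotalSalesAmount", 'InternetSales'[TotalSalesAmount]
        ),
        'Calendar'[OrderDate] >= Cutoff
    )
```

This assumes the relationship from 'InternetSales'[CalendarKey] to 'Calendar'[CalendarKey] exists in the model, so the filter on 'Calendar'[OrderDate] propagates to the fact table.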

Thanks,

Jenni



Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of '


When I deploy the cube, which is sitting on my PC (local), the following four errors come up:

Error 1: The data source 'AdventureWorksDW' contains an ImpersonationMode that is not supported for processing operations.
Error 2: Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW', Name of 'AdventureWorksDW'.
Error 3: Errors in the OLAP storage engine: An error occurred while the dimension with the ID of 'Customer', Name of 'Customer' was being processed.
Error 4: Errors in the OLAP storage engine: An error occurred while the 'Customer Alternate Key' attribute of the 'Customer' dimension from the 'Analysis Services Tutorial' database was being processed.

SSAS 2008 taking all server memory


We just upgraded to SSAS 2008 64-bit (and Windows Server 2008 R2) a couple of weeks ago, and now we have an issue that I can't seem to figure out. Our cubes have not changed at all. Before the upgrade we were running Server 2003 and SSAS 2005 64-bit.

The issue is that over the past week the SSAS servers have started taking up almost all the memory (about 90%) on the server, rendering themselves useless. It is the SSAS service that is taking up almost all the memory, and if I stop other services to free some up, it just takes that too. The issue seems to be happening faster and faster now, so I am at a loss. Here is a breakdown of the issue and what I have tried.

Setup that has issue:

  • SSAS 2008 SP1 64-bit on SQL 2008 R2
  • default configs, as that worked previously in 2005
  • no structure changes
  • machine has 2 quad-core processors and 16 GB of memory
  • accessed via web calls and a linked server on a SQL Server on a different machine

What I have tried

  • installed CU10 (due to another issue)
  • rebuilt aggregations
  • reduced the memory limits to 40% (low) and 50% (max)

Results so far: memory still hits 90% within 24 hours, which tells me the changes have done nothing. I don't know what to do.

I know that there are DMVs in 2008, but I have no idea how to access them. Is there any way to find out what is taking up all the memory? I am trying to find out if it is something in the website code that accesses the machine, because when I point the websites to another SSAS machine (which eventually gets the issue too), the memory clears out after 15-20 minutes.

Any help would be greatly appreciated. I am trying to avoid calling Microsoft unless absolutely necessary, but it is on the list of options.
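On the DMV question: in 2008 they are queried through an MDX query window in SSMS connected to the SSAS instance, using a restricted SQL-like syntax. For example, to see which objects are holding the memory (shrinkable vs. non-shrinkable):

```sql
-- Run in an MDX query window in SSMS, connected to the SSAS instance.
SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE
```

Sorting the results by the memory columns in the client should show whether caches, sessions, or particular cube objects are consuming the bulk of it.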

 

Invoke-ProcessASDatabase database compatibility level of 1500 is below the minimal compatability level of 2147483647


A slightly curious one here: I have a Tabular model deployed to Azure with compatibility level 1500, making use of calculation groups. I can process the model fine via SSMS, but I have a Runbook set up to automate refreshing the model using the Invoke-ProcessASDatabase cmdlet, and I am getting the error:

"The database compatibility level of 1500 is below the minimal compatability level of 2147483647 needed..."

My script is pretty simple and has worked fine for previous models on lower compatibility levels:

$AzureCred = Get-AutomationPSCredential -Name "MyCredential"
Add-AzureAnalysisServicesAccount -RolloutEnvironment 'xxxxx.asazure.windows.net' -ServicePrincipal -Credential $AzureCred -TenantId "XXX-XXX-XXX-XXX"
Invoke-ProcessASDatabase -Server "asazure://xxxxx.asazure.windows.net/xxxxx" -DatabaseName "XXXXX" -RefreshType Full

Has anyone else come across this? My search yielded no results. Is this a bug, or something I need to do differently with the new model level?
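A possible workaround while chasing the compatibility-level check: send the refresh as raw TMSL with Invoke-ASCmd (SqlServer module), bypassing Invoke-ProcessASDatabase's own validation. A sketch reusing the server name from the script above; exact parameter support may vary by module version:

```powershell
$tmsl = @'
{ "refresh": { "type": "full",
               "objects": [ { "database": "XXXXX" } ] } }
'@
Invoke-ASCmd -Server "asazure://xxxxx.asazure.windows.net/xxxxx" -Query $tmsl
```

Since this sends the same TMSL that SSMS generates for a full process, it should behave like the SSMS refresh that already works.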

Thanks
Ian



Ian Roberts

Connection failure - The JSON DDL request failed with the following error: Failed to execute XMLA


We are facing a huge issue after deploying XMLA code. The XMLA deployment itself goes fine, but when we then refresh the model we get the error below:

The JSON DDL request failed with the following error: Failed to execute XMLA. Error returned: 'OLE DB or ODBC error: A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.; 08001; Client unable to establish connection; 08001; Encryption not supported on the client.; 08001. 
A connection could not be made to the data source with the Name of 'DBNAME'

A few times, after restarting the services and the server, it sometimes works and sometimes it does not. We cannot understand why it does not refresh after deployment. BTW, we run the code below from an SSIS package, and it fails even when the DBA runs it manually.

The connections are fine, and impersonation is also fine.

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "DBNAME"
      }
    ]
  }
}

SSAS connection string properties: Provider=SQLNCLI11.1;Data Source=servername;Persist Security Info=False;Integrated Security=SSPI;Initial Catalog=dbname

ImpersonationInfo: Impersonation

Database properties: Default Mode: Import, Default Dataview: Full

Our service ID is also added, with full permissions.

"The processing information cannot be provided" when deploying SSAS from VS 2017


Hello 

I am using Microsoft Visual Studio Tools for Applications 2017 for a multidimensional SSAS project.

When I try to deploy the solution from VS to a local database, a window opens with the following error message:

The processing information cannot be provided. (Microsoft Visual Studio)
Object reference not set to an instance of an object. (Microsoft.AnalysisServices.Project.AS)
Program Location:
   at Microsoft.AnalysisServices.Project.AnalysisServicesProjectManager.unsafe_StartToolWindowForProcessing(String message)


When I close the window, the deployment and the processing of the cube continue without any error.

Does anyone know what the problem is?
Thanks in advance
Chris

Through IIS manager not able to connect to Analysis Services in 2017 version


http://server_name/olap/msmdpump.dll is not working when connecting to SSAS 2017; up to version 2016 it worked fine.

Object reference not set to an instance of an object error


My situation is as follows

I am using Visual Studio Community 2015. I want to compare two tabular models to generate a change script, so I installed BISM Normalizer version 3.

1. First of all, even though I installed BISM Normalizer, I don't see it in the list of installed programs (I use Windows 10). Why is that?

2. OK, then I open a database in VS 2015, go to Tools --> Analysis Services Tabular Designers and change the compatibility version to SQL Server 2014 / SQL Server 2012 SP1 (1103). Then I choose Tools --> Compare Tabular Models, and when I choose both models to compare, I get a dialog box with the error "Object reference not set to an instance of an object." Why am I getting this error?


Possible to track changes in the Tabular models between refreshes?


Is there any mechanism or ability to compare two tabular data models or compare the model before and after the refresh?

When the server refreshes, it loads all the data into a new model and then drops the old model, right?

Is there any mechanism that would allow querying between the two models to look for changes?

BISM Normalizer


Hi,

I'm using the VS 2015 "Tabular compare" to compare two cubes from two different servers. As soon as I click Compare, I get the error "Object reference not set to an instance of an object." I tried pointing to two solutions as well, but I get the same error.

I'm not sure how to get rid of this error. Please advise.

Thanks!!

Dynamic configurations for Tabular SSAS Data Source in Visual Studio


Hi,

I am using different configurations for my SSAS Tabular project in Visual Studio, and I would like to set different data sources for these configurations (development, deployment) as well. Is there any way to achieve "dynamic" data sources for my configurations in Visual Studio?

Note: I can do this for an SSAS Multidimensional project in Solution Explorer, but the Data Source is not available in Solution Explorer for a Tabular project. It is only in Tabular Model Explorer, where a change to the data source applies to all configurations.

Thank you,
Fenix




SQL Case statement to DAX


I am trying to write a DAX statement to replicate a SQL CASE statement that involves two tables, TABLEa and TABLEc, which are connected through TABLEb. Ideally I want to create them as measures. With other BI tools I am able to use CASE and combine columns from two or even three tables. I do not have a data warehouse; this is a simple transactional database.

Measure1: CASE WHEN TABLEc.StartDate >= TABLEa.StartDate AND (TABLEc.ComDate <= TABLEa.StartDate OR TABLEa.StartDate IS NULL) THEN 1 ELSE 0 END

Measure2: COUNT(CASE WHEN TABLEc.StartDate >= TABLEa.StartDate AND (TABLEc.StartDate <= TABLEa.StartDate OR TABLEa.StartDate IS NULL) THEN TABLEc.Id END)
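A hedged sketch of Measure2, assuming TABLEc sits on the many side of a TABLEc -> TABLEb -> TABLEa relationship chain so that RELATED can reach TABLEa's columns from a TABLEc row context (SQL NULL becomes BLANK in DAX):

```dax
Measure2 :=
COUNTROWS (
    FILTER (
        TABLEc,
        -- mirror the SQL CASE condition row by row
        VAR aStart = RELATED ( TABLEa[StartDate] )
        RETURN
            TABLEc[StartDate] >= aStart
                && ( TABLEc[StartDate] <= aStart || ISBLANK ( aStart ) )
    )
)
```

Measure1 follows the same pattern with 1/0 via an IF instead of COUNTROWS; note that, as a measure, the 1/0 result only makes sense evaluated per TABLEc row (e.g. in SUMX).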

Power Query Designer in SSAS using Visual Studio 2019


Hi,

I have used Power Query designer extensively in Power BI and now trying to use it in SSAS.

I am using Visual Studio 2019 to create an SSAS project (compatibility level 1500), and I can import data from a SQL 2019 table/view.

My questions:

1. How do I access Power Query in SSAS?

I tried to find a button for this, but the only way I can access it is via Table Properties -> the 'Design...' button. And there I can only see the selected table's query; unlike the Query Designer/Editor in Power BI, I cannot see all the data import queries.

2. When I added a custom column, sometimes it is visible in the model.bim window but has disappeared from the script. Why is this?

This happens specifically when I add a transformation after the data source selection, once the data has been imported. When I click 'Transform' during data source selection and make changes in the Power Query designer, those changes display fine.

3. Sometimes the custom column (added via Power Query after the data import) is visible in model.bim, but the transformed data for that column is not. Yet after deploying the model to SSAS, when I access the model from Power BI or Excel, the custom column values are visible.

Why is the data for custom column added via Power Query not visible in Visual Studio project?

Any help will be greatly appreciated.

Thanks,

Anand


