Channel: SQL Server Analysis Services forum

Power BI - SSAS Tabular Model - Quiz - need clarification on a question


I have this question before me.

You plan to use Power BI Desktop to create a report. The report will consume data from an on-premises tabular model named SalesDB in Microsoft SQL Server Analysis Services (SSAS). The report will be published to the Power BI service. You need to ensure that the report published to the Power BI service will access the current data in SalesDB. What should you do?

A. Deploy an on-premises data gateway and configure the connection to SalesDB to use the Import Data Connectivity mode.

B. Deploy an on-premises data gateway and configure the connection to SalesDB to use the Connect live option.

C. Deploy an on-premises data gateway (personal mode) and configure the connection to SalesDB to use the DirectQuery Data Connectivity mode.

D. Deploy an on-premises data gateway and configure the connection to SalesDB to use the DirectQuery Data Connectivity mode.

Is the correct answer B or D?


QueryLogConnectionString


SQL 2014

I tried to get AS query logging working by populating QueryLogConnectionString. It doesn't work: the log table gets created, but it is never populated. After researching this further, it seems this is a very unreliable way of logging, since it only works intermittently. So I want to stop logging. I'm trying to follow the article below, which says that clearing QueryLogConnectionString stops AS from logging query statistics. My question is: how do you actually clear it?

https://technet.microsoft.com/en-us/library/cc917676.aspx
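
For context, I was planning to try clearing it programmatically with AMO if SSMS won't let me blank the property. This is only an unverified sketch of what I had in mind; the server name is a placeholder:

using Microsoft.AnalysisServices;

// Unverified sketch: blank out QueryLogConnectionString via AMO.
// "MySsasServer" is a placeholder instance name.
Server server = new Server();
server.Connect("MySsasServer");
ServerProperty queryLogProp = server.ServerProperties["QueryLogConnectionString"];
queryLogProp.Value = "";    // an empty value is assumed to stop query logging
server.Update();            // push the changed server property back to the instance
server.Disconnect();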

A related question: is there any good way of logging AS activity? We want to see real usage statistics, and running a trace for a couple of days crashed the server. Is there any way of getting good stats from AS?

Thanks in advance


André

Customer dimension: some CustomerNo values (key attribute) not showing


Hi, I have a Customer dimension with CustomerNo as the key attribute, connected to the Sales fact table via the CustomerNo key.

I have found that the Customer dimension does not show some CustomerNo values, say CustomerNo = 10023, that exist in the underlying Data Source View. I even checked this in the Visual Studio Dimension Browser.

I was wondering what could be causing the exclusion?

Thank you

SSAS Tabular Forums - stupid error hiding columns?


Hi,

I want to hide columns from client software, but when I hide them in SSDT the columns disappear entirely.

In the past they only changed colors (and, naturally, disappeared in client tools).

Is this normal, and how do I make them visible again?

Regards

SSAS Tabular - how to manage code versions and DB


Hi,

I would like to ask about best practices for managing code versions (and the corresponding databases) across different environments.

My idea is to have three environments (DEV, QUA and PROD), starting from three different copies of the SSDT code, each one connected to its respective tabular DB.

The only difficulty I see here is not forgetting to apply an approved change from DEV to QUA or PROD, because, as you know, a change in this type of code usually means hiding/unhiding something, formatting values, creating a column, creating a measure, creating a hierarchy, renaming, and so on.

Is there any feature in SSDT (the lightweight Visual Studio shell) to manage the several versions of the code?

Regards

SSAS MDX for previous year same month data


Hello there,

I have a physical measure in the cube and need to get the corresponding data for the same month of the previous year.

Consider months such as Dec 17, Jan 18, Feb 18, ..., Dec 18. If only a single month is selected in the client tool (such as Excel or Power BI), I get the previous year's same month using a CURRENTMEMBER.LAG(12) expression. However, this does not work for a multiple selection.

Let's say I selected Dec 17 and Dec 18 in the client tool. In this scenario I should get the aggregated data for Dec 16 and Dec 17, i.e. one year back for each respective selection.

Could you please help to build an expression in MDX that returns this kind of aggregated data?

Thanks,

Palash

DAX query to retrieve the last bank day of the previous month


Hi

I have been struggling with what I think is a simple thing: getting the date of the last bank day of the previous month.

I have a Date dimension with the date and a flag saying whether it is a bank day or not. For some reason I can't get it to work; it only returns the last date and does not look at IsBankDay...

LastBankDayPrevoiusMonth=  LASTDATE( PREVIOUSMONTH(FILTER(All(Dates);Dates[IsBankDay])))

I have tried so many variations now with PREVIOUSMONTH, ENDOFMONTH, MAXX, etc., but I get either an empty result or just the last date.

Can someone point me in the right direction?

Have a nice weekend

Michael

Strange issue when processing SSAS Tabular via SSIS/SQL Agent - The JSON DDL request failed with the following error: Cannot execute the Refresh command: database 'XXX' cannot be found..


Hi all,

I've got an issue when processing a tabular model which is driving me nuts. When processing the model (using the Analysis Services Processing Task in SSIS), I get the following error: The JSON DDL request failed with the following error: Cannot execute the Refresh command: database 'xxx' cannot be found.

If I run the package directly from SSDT it works fine, and the package also runs fine via the SQL Agent job in our dev environment. I've tried looking online for help but cannot find anyone who has had the same issue. I have a feeling it may be a corrupt SQL installation, as some of my other packages also failed to run: the SQL Agent job couldn't find them even though they were there. When I switched the preceding packages off and reran the job, it found the previously missing package but then couldn't find another one - strange behaviour.

Anyway, if anyone can help then I would be very grateful.

Thanks,

Mani


Can't add a reference to Microsoft.AnalysisServices


My greetings to the developers. I have a problem: I can't add a reference to Microsoft.AnalysisServices.dll in Visual Studio 2013 Ultimate 12.0.31101.00 Update 4. I have SQL Server 2014 RTM 12.0.2000.8 Enterprise. The version of the DLL matches SQL Server (12.0.0.0). The path to the DLL is C:\Windows\assembly\GAC_MSIL\Microsoft.AnalysisServices\12.0.0.0__89845dcd8080cc91\Microsoft.AnalysisServices.dll.

I also tried another version of the DLL (10.0.25.31.0), but unfortunately it didn't help. In my desperation I am considering installing another (previous) version of SQL Server.

Create Calculated Tables in SSAS Tabular Model using Tabular Object Model (TOM) scripts


 Hi Team,

I am trying to create an SSAS Tabular database programmatically using Tabular Object Model (TOM) C# scripts. I am able to create all the components of the tabular model except for calculated tables, and I have been unable to find any solutions online.
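
To show what I mean, here is a stripped-down sketch of what I am attempting. Server, database and table names are placeholders, and I am assuming a calculated table is simply a Table whose single partition uses a CalculatedPartitionSource:

using Microsoft.AnalysisServices.Tabular;

// Placeholder names throughout; assumes a model at compatibility level 1200 or higher.
Server server = new Server();
server.Connect(@"localhost\TABULAR");
Database db = server.Databases.FindByName("MyTabularDb");
Model model = db.Model;

// A calculated table is a regular Table whose partition source is a DAX expression.
Table calcTable = new Table { Name = "CalcDates" };
calcTable.Partitions.Add(new Partition
{
    Name = "CalcDates",
    Source = new CalculatedPartitionSource
    {
        Expression = "CALENDARAUTO()"   // any DAX table expression
    }
});

model.Tables.Add(calcTable);
model.SaveChanges();    // deploy the metadata; the table's columns appear after a refresh
server.Disconnect();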

Any insight on the same is highly appreciated. Thanks


SSAS is deploying to a new cube

When I update my SSAS cube and deploy it, all changes are deployed to a new cube in the Analysis Services database.

The deployment process is supposed to replace any pre-existing database with the contents of the deployed project. So why is it not deploying to the right cube, and instead creating a new one with the same name_username_reference? How can I avoid this?

Who will be announced as the next Small Basic Guru? Read more about October 2018 competition!!



What is TechNet Guru Competition?

Each month the TechNet Wiki council organizes a contest of the best articles posted that month. This is your chance to be announced as MICROSOFT TECHNOLOGY GURU OF THE MONTH!

One winner in each category will be selected each month for glory and adoration by the MSDN/TechNet Ninjas and the community as a whole. Winners will be announced in a dedicated blog post published on the Microsoft Wiki Ninjas blog and in a tweet from the Wiki Ninjas Twitter account; links will be published in the Microsoft TNWiki group on Facebook, and other acknowledgement from the community will follow.

Some of our biggest community voices and many MVPs have passed through these halls on their way to fame and fortune.

If you have already made a contribution in the forums or gallery or you published a nice blog, then you can simply convert it into a shared wiki article, reference the original post, and register the article for the TechNet Guru Competition. The articles must be written in October 2018 and must be in English. However, the original blog or forum content can be from before October 2018.

Come and see who is making waves in all your favorite technologies. Maybe it will be you!


Who can join the Competition?

Anyone who has basic knowledge and the desire to share that knowledge is welcome. Articles can appeal to beginners or discuss advanced topics. All you have to do is add your article to the TechNet Wiki under your own specialty category.


How can you win?

  1. Copy or write up your Microsoft technical solutions and revelations on TechNet Wiki.
  2. Add a link to your new article on THIS WIKI COMPETITION PAGE (so we know you've contributed).
  3. (Optional but recommended) Add a link to your article in the TechNetWiki group on Facebook. The group is very active and people love to help; you can get feedback and even direct improvements to the article before the contest starts.

Do you have any question or want more information?

Feel free to ask any questions below, or join us in the official Microsoft TechNet Wiki groups on Facebook. Read more about the TechNet Guru Awards.

If you win, people will sing your praises online and your name will be raised as Guru of the Month.


PS: The top banner came from Baishakhi Banerjee.

Thanks,
Kamlesh Kumar

If my reply is helpful, please mark it as Answer or vote it as Helpful.

My blog | Twitter | LinkedIn

Two separate tables of one-to-many relationships in dimensional modeling


Hello Everyone,

I am working on creating a star schema from a normalized database. I have millions of records in most of the tables. I have separated out the many-to-many relationships and modeled them as new dimensions. Then I denormalized the rest of the tables, which had one-to-many relationships. But, due to business requirements, I am keeping two or three tables as separate dimensions, which actually hold foreign keys of other dimensions and have a one-to-many relationship with them.

Suppose product and supplier have a one-to-many relationship, so they should be kept in one dimension. But in my case I am creating two different dimensions for them and providing the connection via the fact table.

Is this a correct way to do it? I have millions of records in both tables; if I denormalize them, the overhead on the dimension table, and then on the fact, will increase.

Please advise me on how to approach this issue.


Null record in dimension tables


Hi,

I am new to the dimensional modeling approach and I am trying to populate the dimension tables in a star schema, but I am not sure about when to add the null (unknown) record used for the fact table lookups. So my questions are:

1) Do I add a null record to each dimension first, right after populating it, and then load the fact table?

2) Or should I first populate everything, including dimensions and the fact, and then add a null record to each dimension?

Thank You

SQL query


I am trying to get the result described below.

I have a table like this:

Period          Turnover
P1 (July)       200
P1 (July)       400
P2 (August)     300
P2 (August)     500
P3 (September)  400
...

How can we dynamically get the sum of the latest two periods, the average of the next two periods, and the average of the following four periods, so that the calculation keeps working whenever a new period such as October or November is added?


Baljit singh


Encoding/Encoding Hints


Hi!

I have SQL Server 2017 and a tabular model with a fact table that has about 10 measure columns, all money in the database and decimal in the tabular model. I have set the Encoding Hint of all of them to VALUE. When checking the model with VertiPaq Analyzer, however, three of them end up HASH encoded and seven of them VALUE encoded.

I am curious why. I would like all ten to be VALUE encoded - or am I missing something? How can I "influence" the outcome, beyond the EncodingHint, so they all end up VALUE encoded (I think I read somewhere that value encoding is preferable for measure columns)?
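
For reference, the property I am talking about is the column-level EncodingHint; set programmatically it would look roughly like this (a TOM sketch with placeholder names, assuming compatibility level 1400 or higher). As far as I understand, the hint is only a request, so the engine can still fall back to hash encoding:

using Microsoft.AnalysisServices.Tabular;

// Placeholder server/database/table/column names.
Server server = new Server();
server.Connect(@"localhost\TABULAR");
Model model = server.Databases.FindByName("MyTabularDb").Model;

// Ask the engine to prefer value encoding for this numeric column.
Column amount = model.Tables["FactSales"].Columns["Amount"];
amount.EncodingHint = EncodingHintType.Value;

model.SaveChanges();    // persist the metadata change; it takes effect on the next full process
server.Disconnect();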

Using C# and AMO, How to find which Table/View/NamedQuery in DSV is coming from which Data Source?


Hi All,

I am working on extracting Cube Metadata as we have to do a massive refactoring. 

I have written a C# script to extract the list of Tables/Views/Named Queries from the DSV, but I am also looking for the Data Source name of each Table/View/Named Query.

For example: in my cube database I have two connections, one pointing to a SQL Server DB (DataSource_SQL) and a second one to an Oracle DB (DataSource_Oracle).

I have a DSV (as we know, a single DSV can contain Tables/Views/Named Queries from multiple Data Sources) which has tables from both Data Sources (DataSource_SQL and DataSource_Oracle).

Using the code below I am getting the list of Tables/Views/Named Queries from each DSV in an SSAS database, but I want to include the Data Source for each table.

This code is inside an SSIS Script Task.

#region Namespaces
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.IO;
using Microsoft.AnalysisServices;
using System.Data.SqlClient;

#endregion

/************************************************************************************
 * This Code is to Export SSAS Data Source View to a SQL Server Table
************************************************************************************/



namespace ST_ce1153fb3fc948db939934521530439e
{

[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{

public void Main()
{



            string olapServerName = Dts.Variables["$Package::pSSASServerName"].Value.ToString();
            string olapDatabaseName = Dts.Variables["$Package::pSSASDataBaseName"].Value.ToString();
            string dbServerConnectionString = Dts.Variables["$Package::pBDConnectionString"].Value.ToString();


            using (SqlConnection sqlDbCon = new SqlConnection(dbServerConnectionString))
            {
                sqlDbCon.Open();

                string sSSASDatabaseName;
                string sDataSourceViewName;
                string sTableFriendlyName;
                string sTableType;
                string sTableName;
                string sSchema;

                // connect to the OLAP server 
                Server olapServer = new Server();
                olapServer.Connect(olapServerName);
                if (olapServer != null)
                {
                    // connected to server ok, so obtain reference to the OLAP database
                    Database olapDatabase = olapServer.Databases.FindByName(olapDatabaseName);
                    if (olapDatabase != null)
                    {
                        sSSASDatabaseName = olapDatabaseName;

                        // export SQL from each data source view (usually only one, but can be many!)
                        foreach (DataSourceView dsv in olapDatabase.DataSourceViews)
                        {
                            Console.WriteLine(string.Format("Exporting SQL from DSV '{0}'", dsv.Name));

                            sDataSourceViewName = dsv.Name;

                            // for each table in the DSV, export the SQL in a file
                            foreach (DataTable dt in dsv.Schema.Tables)
                            {
                                sTableName = string.Empty;
                                sSchema = string.Empty;

                                // get name of the table in the DSV
                                // use the FriendlyName as the user inputs this and therefore has control of it
                                string queryName = dt.ExtendedProperties["FriendlyName"].ToString();

                                sTableFriendlyName = queryName;
                                sTableType = dt.ExtendedProperties["TableType"].ToString();


                                if (dt.ExtendedProperties["QueryDefinition"] != null)
                                {
                                    sTableType = "Named Query";
                                    sTableName = dt.ExtendedProperties["QueryDefinition"].ToString();
                                }
                                else
                                {
                                    sTableName = dt.ExtendedProperties["DbTableName"].ToString();
                                    sSchema = dt.ExtendedProperties["DbSchemaName"].ToString();
                                }

                                string cmdString = "INSERT INTO [SSASMultidimensionalStaging].[DataSourceView] " +
                                        "(SSASDatabaseName,DataSourceViewName,TableFriendlyName,TableType,[TableName],[Schema],ModifiedDateTime,CreatedDateTime)" +
                                        "VALUES (@val1, @val2, @val3, @val4, @val5, @val6, @val7, @val8)";
                                using (SqlCommand comm = new SqlCommand())
                                {
                                    comm.Connection = sqlDbCon;
                                    comm.CommandText = cmdString;
                                    comm.Parameters.AddWithValue("@val1", sSSASDatabaseName);
                                    comm.Parameters.AddWithValue("@val2", sDataSourceViewName);
                                    comm.Parameters.AddWithValue("@val3", sTableFriendlyName);
                                    comm.Parameters.AddWithValue("@val4", sTableType);
                                    comm.Parameters.AddWithValue("@val5", sTableName);
                                    comm.Parameters.AddWithValue("@val6", sSchema);
                                    comm.Parameters.AddWithValue("@val7", DateTime.Now);
                                    comm.Parameters.AddWithValue("@val8", DateTime.Now);
                                    comm.ExecuteNonQuery();
                                }
                            }
                        }
                    }
                }
                sqlDbCon.Close();
            }


            Dts.TaskResult = (int)ScriptResults.Success;
}

        #region ScriptResults declaration
        /// <summary>
        /// This enum provides a convenient shorthand within the scope of this class for setting the
        /// result of the script.
        /// 
        /// This code was generated automatically.
        /// </summary>
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };
        #endregion

}
}
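
What I am thinking of adding inside the loop over dsv.Schema.Tables is something along these lines. This is an unverified sketch: I am assuming that dsv.DataSourceID returns the DSV's primary data source and that tables pulled from a secondary data source carry a "DataSourceID" extended property:

// Unverified sketch, to go inside the foreach over dsv.Schema.Tables.
// Assumption: dt.ExtendedProperties contains a "DataSourceID" entry only for
// tables that come from a secondary data source; otherwise fall back to the
// DSV's primary data source.
string sDataSourceId = dsv.DataSourceID;
if (dt.ExtendedProperties.ContainsKey("DataSourceID") &&
    dt.ExtendedProperties["DataSourceID"] != null)
{
    sDataSourceId = dt.ExtendedProperties["DataSourceID"].ToString();
}

// Resolve the ID to the friendly data source name, if it can be found.
DataSource ds = olapDatabase.DataSources.Find(sDataSourceId);
string sDataSourceName = (ds != null) ? ds.Name : sDataSourceId;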


Thanks Shiven:) If Answer is Helpful, Please Vote

MDX with YTD and PARALLELPERIOD slow?


Hi,

I am running the following MDX query and I think it runs slowly: it takes 8 seconds for 28,462 rows and 9 columns.
The raw fact data is about 1,000,000 rows, containing data at a monthly grain over two years (24 months).

According to my SQL Profiler trace, the formula engine takes 99% of the execution time and the storage engine only 1%.

I have already done the following:

1. Created a Usage Based Optimization aggregation and checked in SQL Profiler whether the aggregation is used (it is).
2. Tried to rewrite the MDX in different ways.
3. Set NON EMPTY BEHAVIOR on the calculated measures.
4. Checked the configuration of my DateTime dimension.
5. Checked the attribute relationships.

According to this article, https://sqldusty.com/2014/03/07/do-you-know-why-your-mdx-query-is-slow/, my MDX query is not optimal because the formula engine spends most of the time calculating the results.
At the moment I don't know why it takes 8 seconds or what I can do to improve the performance.
The query should preferably run within 3 seconds or less, so that the SSRS report is available within an acceptable time.

The server has 4 vCores and 24 GB of memory.

MDX Query

SELECT NON EMPTY
    {
        [Measures].[Aantal Prognose t/m], --YTD Measure
        [Measures].[Omzet Prognose t/m], --YTD Measure
        [Measures].[Aantal Prognose Vorig Jaar t/m], --PARALLELPERIOD Measure
        [Measures].[Omzet Prognose Vorig Jaar t/m] --PARALLELPERIOD Measure

    }
On 0,
NON EMPTY
    [Opendatum].[H - Openjaar - Openperiode].[Openperiode Omschrijving]
    *
    [Specialisme].[Specialisme].[Specialisme].Members
    *
    [Zorgactiviteit].[H - Tarieftypegroep - Tarieftype - Declaratie].[Declaratie].Members
    *
    [Zorgproduct].[Zorgproduct].[Zorgproduct].Members
    *
    [DBCDiagnose].[H - Diagnosegroep - Diagnose].[Diagnose].Members

ON 1
FROM Productie
WHERE
([Opendatum].[Openjaar Periode Nummer].&[201808],[Omzetprognosegroep].[Omzetprognose Groep1].&[Overige productie])

Definition of the measures used in the MDX query:

Measure: "Aantal prognose t/m"

AGGREGATE(YTD([Opendatum].[H - Openjaar - Openperiode - Opendatum].CurrentMember), [Measures].[Aantal Prognose])

Measure: "Omzet prognose t/m"

AGGREGATE(YTD([Opendatum].[H - Openjaar - Openperiode - Opendatum].CurrentMember), [Measures].[Omzet Prognose])

Measure: "Aantal Prognose Vorig Jaar t/m"

(PARALLELPERIOD([Opendatum].[H - Openjaar - Openperiode - Opendatum].[Openjaar],1,[Opendatum].[H - Openjaar - Openperiode - Opendatum]),[Measures].[Aantal Prognose t/m])

Measure: "Omzet Prognose Vorig Jaar t/m"

(PARALLELPERIOD([Opendatum].[H - Openjaar - Openperiode - Opendatum].[Openjaar],1,[Opendatum].[H - Openjaar - Openperiode - Opendatum]),[Measures].[Omzet Prognose t/m])

   

MDX query to join two cubes and get the measures and dimensions from both cubes


Hi,

Please help me with how to join two cubes and get the data.

I tried using LookupCube, but that takes a lot of time to retrieve the data.

Is there any other way to get this done?

SSAS Tabular Calculation


Hi,

I am using SSAS 2014 Tabular, in which I have a cost measure based on location.

Example:

Location       Cost
abc_India      20
final_india    20
bcd_china      20
final_china    20

Basically, the cost is the same for every location, and Location will be used as a dimension.

Now the requirement is to show final_india + final_china, i.e. 40, as the default measure value, but it should split into the four values when I drag Location onto columns.

Example:

COST = 40 when I am not using the Location dimension,

but when I am using the Location attribute on columns it should split as:

Location       Cost
abc_India      20
final_india    20
bcd_china      20
final_china    20
