Friday, January 29, 2016

YTD and Previous YTD calculations

Regardless of your reasons for developing an OLAP database, you will almost always have to create a YTD and/or previous-YTD calculation. I don't do these calculations often, which means I frequently have to scramble for the proper code and syntax to make them work. Having just recently done this for SSAS 2013, here is the process for you and me.


For starters, make sure you have created a proper date dimension with, at a minimum (for best effect), a date and a year column. Yes, this seems obvious, but it still needs to be said. Now that we have that out of the way, the steps are as follows:
  1. Add Calendar hierarchy (Works best this way)
  2. Make sure Date Dimension is specified as Time in Dimension Properties
  3. Add Attribute Relationships (see my previous blog post on this)
Now, to create the YTD measure, use the following MDX calculation (I know there is a YTD function, but I have had more success with this method):

AGGREGATE
(
  PERIODSTODATE
  (
    [Dim Date].[Calendar].[Year],
    [Dim Date].[Calendar].CurrentMember
  ),
  [Measures].[Amount]
)

In my example I have a date dimension called Dim Date with a Calendar hierarchy. I named the calculated measure above YTD. Next, using that same measure, I wrapped it in a ParallelPeriod function to get the previous year's YTD. The code is as follows:

SUM
(
  PARALLELPERIOD
  (
    [Dim Date].[Calendar].[Year],
    1,
    [Dim Date].[Calendar].CurrentMember
  ),
  [Measures].[YTD]
)


This should work as long as you have processed the YTD measure first and then created the Previous YTD calculation. Also, the measure needs to be sliced by a Year attribute to work (you will see NULLs otherwise).
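To sanity-check both measures, you can run a simple MDX query like the following. The cube name [Sales] is a placeholder; substitute your own cube, and note this assumes the YTD and Previous YTD calculations above have already been deployed:

SELECT
  { [Measures].[YTD], [Measures].[Previous YTD] } ON COLUMNS,
  [Dim Date].[Calendar].[Year].MEMBERS ON ROWS
FROM [Sales]

Putting Year on rows gives each YTD value the year context it needs, which is why the NULL issue mentioned above does not appear here.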


Again, a common calculation but commonly done incorrectly. Hopefully this helps you.


Friday, November 20, 2015

Power BI DAX Calculations


I use Power BI a lot. Particularly lately I’ve been using the Power BI Desktop that Microsoft seems to be updating almost weekly (Just kidding, but it is on about a monthly update schedule).  When creating calculations, I find some of the most impactful calculations are YTD, Same Period Last Year, and % of Total.

To save you some time, here are the Measure Calculations.... 

·     % of Total

% of Total = SUM(DailySales[Net Sales]) / CALCULATE(SUM(DailySales[Net Sales]), ALLSELECTED())

·     Year to Date
Basically, we need to make sure the Date column is formatted as a Date data type.

Net Sales YTD = TOTALYTD(SUM(DailySales[Net Sales]), Dates[Date])

·     Sales of the Previous Year

Net Sales PY = CALCULATE(SUM(DailySales[Net Sales]), SAMEPERIODLASTYEAR(Dates[MonthYear]))

Using these three expressions, you can easily modify them to accommodate other time-intelligence calculations and percentages. So if you want the % change versus the previous year, you can use the above measure as follows:
         % ChangeVSLastYr = (SUM(DailySales[Net Sales]) / [Net Sales PY]) - 1
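One caveat: if [Net Sales PY] is blank or zero (for example, in the first year of data), the raw division above can error out or mislead. A variant using the DAX DIVIDE function handles this more gracefully; this is a sketch using the same hypothetical table and measure names as above:

% ChangeVSLastYr Safe = DIVIDE(SUM(DailySales[Net Sales]), [Net Sales PY]) - 1

DIVIDE returns BLANK when the denominator is zero or blank (or an alternate result you pass as an optional third argument), instead of raising a division error.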

Friday, August 28, 2015

Find Column Names across a SQL Database


When working against databases that you're not particularly familiar with, and you don't have a data dictionary (or any schema information), you often need a quick way to find column names in a database.
Using this query I created, you can specify a value for the @FindColumn parameter, and it will look for that column in the database of your choice. In this example, I'm looking for the 'clientkey' column across the database ProdDB.
 
-- FIND clientkey columns in SQL Database ProdDB --

USE ProdDB

DECLARE @FindColumn varchar(32)
SET @FindColumn = 'clientkey'

SELECT t.name AS TableOrViewName,
       [type_desc] AS [Type],
       SCHEMA_NAME(schema_id) AS [Schema],
       c.name AS ColumnName
FROM sys.tables AS t
INNER JOIN sys.columns AS c ON t.object_id = c.object_id
WHERE c.name LIKE @FindColumn

UNION

SELECT v.name, type_desc, SCHEMA_NAME(schema_id) AS [Schema], cl.name
FROM sys.views AS v
INNER JOIN sys.columns AS cl ON v.object_id = cl.object_id
WHERE cl.name LIKE @FindColumn

ORDER BY [Schema], TableOrViewName
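One thing to keep in mind: LIKE without wildcards behaves like an exact match. If you only know part of a column name, add wildcards to the parameter value before running the query, for example:

-- Match any column name containing 'client'
SET @FindColumn = '%client%'

This makes the query far more useful on unfamiliar schemas, at the cost of a few extra rows to scan through.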

 

Monday, June 29, 2015

Common Date Calculations for Parameters and SQL variables


I'm commonly asked to make date parameters start at specific points, typically the first or last day of the previous month. Here are the T-SQL and SSRS expressions to create the dates:
Using T-SQL:
--First Day of Current Month
SELECT DATEADD(mm, DATEDIFF(mm, 0, GETDATE()), 0)

--Last Day of Current Month
SELECT DATEADD (dd, -1, DATEADD(mm, DATEDIFF(mm, 0, GETDATE()) + 1, 0))

 --First Day of Previous Month
SELECT DATEADD(mm, DATEDIFF(mm, 0, GETDATE()) - 1, 0)

--Last Day of Previous Month (midnight, consistent with the expressions above)
SELECT DATEADD(dd, -1, DATEADD(mm, DATEDIFF(mm, 0, GETDATE()), 0))
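The same anchor-date pattern (DATEADD over a DATEDIFF from day zero) extends naturally to years. These follow the expressions above, swapping the month interval for the year interval:

--First Day of Current Year
SELECT DATEADD(yy, DATEDIFF(yy, 0, GETDATE()), 0)

--Last Day of Current Year
SELECT DATEADD(dd, -1, DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) + 1, 0))

DATEDIFF(yy, 0, GETDATE()) counts whole years since the base date (1900-01-01), so adding that many years back to the base date lands on January 1 of the current year; adding one more year and subtracting a day lands on December 31.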
 
When you need to create these directly in an SSRS report, use the following expressions:
 
--First date of Current Month
=Today.AddDays(1-Today.Day)

--Last date of Current Month
=Today.AddDays(1-Today.Day).AddMonths(1).AddDays(-1)

--First date of Current Year
=Today.AddMonths(1-Today.Month).AddDays(1-Today.Day)

--Last date of Current Year
=Today.AddDays(1-Today.Day).AddMonths(13-Today.Month).AddDays(-1)

--Previous Month Last Date
=DateAdd("d", -Day(Today), Today)

--Previous Month First Date
=DateAdd(DateInterval.Month, -1, Today.AddDays(-(Today.Day-1)))

Monday, April 6, 2015

Creating Environments for Testing

As I've said during rants about career options, building a lab for testing is essential to your development as an IT professional. I've pointed out where you can find a treasure trove of IT books (http://it-ebooks.info/) and also where you can get free training (Microsoft Virtual Academy). Using these two resources, you get the books and the training; now you can use the Microsoft Virtual Labs to play around with the technologies you have read about and learned.

Now the third part of your learning process should center on practicing what you have learned. I've said in previous posts that you should establish a lab to test learned technologies so that when you need to execute them in the real world, you'll be more effective. Testing requires a lab environment that mimics the real world. Unfortunately, this is where you can run into a few challenges trying to establish the appropriate environment. This is why I recommend using the Microsoft Virtual Labs: https://technet.microsoft.com/en-us/virtuallabs

Using Microsoft Virtual Labs

The main advantage is that it provides a ready-made environment in which you can practice what you learned. You don't have to spend any time setting up a test environment, which is often the most laborious and costly side of testing. I've worked in support for a long time (too long, in my opinion), and having an environment available to test customer issues is critical to understanding and resolving them.
It can be costly and very time consuming to stand up enterprise solutions like SharePoint in a testing format. Even if you are using virtualization (Hyper-V or VMware), it is still expensive because it takes a lot of disk space and memory (RAM), not to mention the processing power needed to effectively run a "usable" installation. Sure, you can get by with these methods for a simple SQL solution or even a barebones SharePoint environment, but those are often not the best environments for testing enterprise-scale solutions.
One of the only downsides is that each virtual lab environment expires in 90 minutes. That may not seem like a long time, but it is generally sufficient for testing once all the configuration has been done. The obvious con is that if you build out a lab environment with a lot of configuration, that work will be lost. Therefore, it's important to factor that restriction into your testing. This is for testing issues that can be reproduced quickly, or for learning, and not for testing solutions that require hours to set up.

 

Friday, March 6, 2015

Creating an Azure SQL Server Reporting Portal for Public Usage

I love using Windows Azure.

This technology saves you a lot of time when it comes to deploying Microsoft server technology. No longer do we have to get a physical server, install all the appropriate hardware, install all the software, configure the software, and handle the security, change control, governance, and everything else required just to get the server up and running. With Windows Azure, you simply open the Microsoft Azure portal, create the appropriate virtual machine, and pick the required template. Tasks that used to take days can now be done in an hour.
Making your SQL Azure server available for Internet users essentially consists of the following two tasks:
  1. Open an HTTP/HTTPS endpoint for your VM in the Azure management portal (or during installation of the VM)
  2. Open a firewall port on the VM (after the VM has been installed)
The process is as follows...

First we have to create a virtual machine for the SQL Server. You can use any SQL Server version, as all of them have the SQL Server Reporting Services components installed.
 


Next, when creating your VM, make sure you add the option to connect via HTTP. This is done by adding the HTTP endpoint. You can specify a private and a public port (80 is the default for HTTP).
 


Continue until provisioning completes (i.e., the "Creating Virtual Machine" status clears). Depending on the size of the VM (in this example I'm creating a SQL Server web server), it should take about 20 minutes to an hour. You then have to set up SQL Server Reporting Services, because by default the URL, Report Manager, and the reporting database are not configured (SQL Server 2014).
Open Windows PowerShell on your virtual machine and open a port so the VM is available to Internet users:
  1. Open a command window (cmd.exe)
  2. When the command window opens, run PowerShell (Powershell.exe)
  3. Run the following command via PowerShell on the virtual machine:

New-NetFirewallRule -DisplayName "Report Server Name" -Direction Inbound -Protocol TCP -LocalPort 80
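If you also opened an HTTPS endpoint for the VM, the same cmdlet opens the corresponding firewall port; the display name here is just a label of my choosing:

New-NetFirewallRule -DisplayName "Report Server HTTPS" -Direction Inbound -Protocol TCP -LocalPort 443

HTTPS is worth the extra setup for a public-facing report server, since credentials and report data otherwise travel over the wire in the clear.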

Open the SQL Server Reporting Services Configuration Manager to configure Reporting Services. First, open the tool and connect to the existing SQL Server.

 
Next, establish the SSRS Web Service URL…
 
Make sure you create the database next. Follow the wizard and input the required information. This is a straightforward process that will either create a new database or use an existing one. I would suggest creating a new one as it is a lot easier.


Next, create the actual SSRS reporting portal that you will typically use to upload your reports.
 


Once you have installed the SQL Reporting database and enabled the Report Server websites, you can connect directly to the server via a browser. You will have to use the fully qualified domain name (FQDN), as follows:

<Servername>.cloudapp.net/reports  (*.cloudapp.net is the default Azure DNS)

You can now have users connect to the reports, develop reports, and perform other SQL Reporting tasks that you would normally accomplish via an on-premises SQL Server.

 

Creating PowerPivot data sources


As I have demonstrated, there are a lot of things about Power BI to get excited about. One of these is Microsoft's adoption of the xVelocity technology. The true power behind PowerPivot is the xVelocity in-memory analytics engine.
Why xVelocity is important

Before Microsoft's introduction of xVelocity, most of the reporting offerings were via SSRS, ASP.NET, and PerformancePoint Services. SSRS and ASP.NET use a lightweight development model. These visualization tools relied on data sources like SQL Server that were traditionally stored on hard disks. Hard disks are relatively slow; whenever a system must use resources stored on disk, the response time of the report or dashboard suffers. This is why Microsoft now uses xVelocity. xVelocity gives you two very important things: faster dataset creation (via PowerPivot) and faster responses from the visualization tools that rely on those datasets.
PowerPivot as a data source
The power to create PowerPivot data sources cannot be overstated. With this technology you can create a centralized data source that encompasses various sources: you can build a single PowerPivot data source from text files, Excel worksheets, and database tables. In my example, I'm going to demonstrate how to do this with SQL Server. PowerPivot data sources differ from others because the dataset is stored entirely in memory. Again, this is due to the xVelocity technology.

PowerPivot is essentially the database “Tabular” model

The Tabular model is a database design much like OLAP cubes. However, Tabular cannot totally replace MDX cubes, mainly because of Tabular's memory requirements: Tabular models must be stored in memory, while MDX cubes can be stored on physical disks, allowing extremely large data sources. On the other hand, the MDX language is more difficult to use, and OLAP cube development can be much more frustrating. DAX, the expression language of Tabular/PowerPivot data sources, is easier to use and understand. Excel users will find DAX easy to adopt because of its similarity to Excel formulas.