Google Track
Friday, March 30, 2012
Rafal Lukawiecki at the SQL Server 2012 official launch in Oslo
First, thanks for the invitation from Microsoft, especially Thale Mjavatn, and for the fine organization of such an important event. You can find the details of the event on this blog:
http://biblogg.no/2012/03/15/lansering-av-microsoft-sql2012/
and the agenda of the conference is here: http://www.microsoft.com/norge/bi-sql-fagdag/index.html
Rafal was fantastic as always and did a remarkable job explaining to the audience the new technologies behind MS SQL Server 2012, with special focus on the Business Intelligence enhancements included. We were introduced to two new technologies, Power View and BISM, and to some small changes in what PowerPivot now is. He also spoke about trends, arguing that BI and Big Data will be the most important things in the future.
Once more thank you for the opportunity and thanks to all who were part of it.
Labels:
analytics,
bi,
Big,
BISM,
data,
Lukawiecki,
mining,
pivot,
power,
power view,
PowerPivot,
Predictive,
Rafal
Wednesday, March 28, 2012
BI News from Panorama
Sort through data with business intelligence
With endless information accessible in today's big data sprawl, it is often difficult for businesses to separate valuable data from the rest without outside assistance. Because of this, more and more organizations are turning to business intelligence to sort through vast reams of data and develop concrete analytics.
A new report by DSquared Media provides valuable insight into the worth of adopting business intelligence software and infrastructure. According to the study, every $1.00 spent on business analytics yielded $10.66 in returns. Furthermore, 74 percent of organizations that manually assembled data from various sources negatively affected their daily operations.
The report also found that many large corporations used BI in their developmental years to become the giants they are today. For example, Febreze used a BI-aided marketing campaign when first released, and its sales now total over $1 billion a year. In addition, Target used BI-aided marketing campaigns and its revenues grew from $44 billion in 2002 to $67 billion in 2010.
The study found an assortment of reasons why people were influenced by BI. Ninety-five percent of respondents were influenced by BI's ability to increase insight into operations. Furthermore, 85 percent found that business intelligence provided faster process and reporting cycle times, while only 48 percent were influenced by regulatory compliance.
Labels:
analytics,
business,
churn,
corporates,
data,
information,
intelligence,
organizations,
prediction,
Predictive,
yield
Sunday, March 25, 2012
Average Calculations in MDX
Average is a very important function for calculating the Key Figures of your business. Here are some samples of Average, Avg(), starting with basic calculations and moving on to more advanced ones, like a deep dive into moving averages:
// Avg over the NonEmpty set of products
WITH
MEMBER [Measures].AvgProductSales AS
Avg
(
NonEmpty
(
[Product].[Product].[Product].MEMBERS
,[Measures].[Sales Amount]
)
,[Measures].[Sales Amount]
)
SELECT
[Measures].AvgProductSales ON 0
,[Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works];
// Avg over the full product set
WITH
MEMBER [Measures].AvgProductSales AS
Avg
(
[Product].[Product].[Product].MEMBERS
,[Measures].[Sales Amount]
)
SELECT
[Measures].AvgProductSales ON 0
,[Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works];
// Avg over EXISTING products within each subcategory
WITH MEMBER [Measures].AvgProductSales
AS Avg(EXISTING [Product].[Product].[Product].MEMBERS, [Measures].[Sales Amount])
SELECT {[Product].[Product Categories].[Subcategory].[Bike Racks]
,[Product].[Product Categories].[Subcategory].[Bike Stands]
} ON 0
, [Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works]
WHERE [Measures].AvgProductSales
;
// Avg over Filter
WITH
MEMBER [Measures].AvgGrowingProducts AS
Avg
(
Filter
(
[Product].[Product].[Product].MEMBERS
,[Measures].[Sales Amount] > ([Measures].[Sales Amount],ParallelPeriod([Date].[Calendar].[Month]))
)
,[Measures].[Sales Amount]
)
SELECT
[Measures].AvgGrowingProducts ON 0
,Descendants
(
[Date].[Calendar].[Calendar Year].&[2003]
,[Date].[Calendar].[Date]
) ON 1
FROM [Adventure Works];
// Correct Avg over Filter
WITH
MEMBER [Measures].Growth AS
IIF
(
[Measures].[Sales Amount] > ([Measures].[Sales Amount] ,ParallelPeriod([Date].[Calendar].[Month]))
,[Measures].[Sales Amount]
,NULL
)
,FORMAT_STRING = 'Currency'
MEMBER [Measures].AvgGrowingProducts AS
Avg
(
[Product].[Product].[Product].MEMBERS
,[Measures].Growth
)
SELECT
[Measures].AvgGrowingProducts ON 0
,Descendants
(
[Date].[Calendar].[Calendar Year].&[2003]
,[Date].[Calendar].[Date]
) ON 1
FROM [Adventure Works];
// Average by day of week
WITH MEMBER [Measures].AvgByDayOfWeek AS
Avg(
Exists(
NULL:[Date].[Date].CurrentMember,
[Date].[Day of Week].CurrentMember)
, [Measures].[Sales Amount])
SELECT [Measures].AvgByDayOfWeek ON 0
--, [Product].[Product].[Product].MEMBERS ON 1
, [Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works]
;
WITH MEMBER [Measures].AvgByDayOfWeek AS
Avg(
Nest(
NULL:[Date].[Date].CurrentMember,
[Date].[Day of Week].CurrentMember)
, [Measures].[Sales Amount])
SELECT [Measures].AvgByDayOfWeek ON 0
--, [Product].[Product].[Product].MEMBERS ON 1
, [Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works]
;
// ===========
// Running sum
// ===========
// Typical approach for running sum
WITH
MEMBER [Measures].RunSales AS
Sum
(
NULL : [Date].[Date].CurrentMember
,[Measures].[Sales Amount]
)
SELECT
[Measures].RunSales ON 0
,[Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works];
// Trying to optimize...
WITH
MEMBER [Measures].RunSales AS
Sum
(
Union
(
NULL : [Date].[Calendar].Parent.PrevMember
,
[Date].[Calendar].FirstSibling : [Date].[Calendar].CurrentMember
)
,[Measures].[Sales Amount]
)
SELECT
[Measures].RunSales ON 0
,[Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works];
// Forcing back into block mode
WITH
MEMBER [Measures].RunSales AS
Sum
(
NULL : [Date].[Calendar].Parent.PrevMember
,[Measures].[Sales Amount]
)
+
Sum
(
[Date].[Calendar].FirstSibling : [Date].[Calendar].CurrentMember
,[Measures].[Sales Amount]
)
SELECT
[Measures].RunSales ON 0
,[Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works];
// Better caching...
WITH
MEMBER [Measures].RunMonthSales AS
Sum
(
NULL : [Date].[Calendar].CurrentMember
,[Measures].[Sales Amount]
)
MEMBER [Measures].RunSales AS
([Measures].RunMonthSales, [Date].[Calendar].Parent.PrevMember)
+
Sum
(
[Date].[Calendar].FirstSibling : [Date].[Calendar].CurrentMember
,[Measures].[Sales Amount]
)
SELECT
[Measures].RunSales ON 0
,[Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works];
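The intro above promises a deep dive into moving averages, but the samples stop at day-of-week averages and running sums. A minimal sketch of a simple moving average over the Date level would look like this (the window length of 3 days is an arbitrary choice for illustration):

```mdx
// 3-day moving average: the current date plus the two preceding dates
WITH
MEMBER [Measures].MovAvg3Days AS
Avg
(
[Date].[Date].CurrentMember.Lag(2) : [Date].[Date].CurrentMember
,[Measures].[Sales Amount]
)
SELECT
[Measures].MovAvg3Days ON 0
,[Date].[Date].[Date].MEMBERS ON 1
FROM [Adventure Works];
```

The Lag(2) : CurrentMember range slides with each row on axis 1, which is what turns a plain Avg into a moving average.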
Labels:
average,
avg,
Business intelligence,
calculated,
deep,
diving,
figures,
key,
kpi,
mdx,
member,
moving
Basic Calculations in MDX
// Connect to 2005 Adventure Works
// Gross margin using 1-a/b
WITH
MEMBER [Gross Margin] AS
1
-
[Measures].[Internet Total Product Cost]
/
[Measures].[Internet Sales Amount]
, FORMAT_STRING = 'Percent'
MEMBER [Max Gross Margin] AS
Max
(
(
[Customer].[Customer Geography].[Customer]
,[Product].[Product Categories].[Subcategory]
,[Date].[Calendar].[Calendar Year]
)
,[Measures].[Gross Margin]
)
SELECT
[Measures].[Max Gross Margin] ON 0
FROM [Adventure Works];
// Gross margin using (b-a)/b
WITH
MEMBER [Gross Margin] AS
(
[Measures].[Internet Sales Amount]
-
[Measures].[Internet Total Product Cost]
)
/
[Measures].[Internet Sales Amount]
, FORMAT_STRING = 'Percent'
MEMBER [Max Gross Margin] AS
Max
(
(
[Customer].[Customer Geography].[Customer]
,[Product].[Product Categories].[Subcategory]
,[Date].[Calendar].[Calendar Year]
)
,[Measures].[Gross Margin]
)
SELECT
[Measures].[Max Gross Margin] ON 0
FROM [Adventure Works];
Labels:
Calculated member,
Calculation,
function,
gross,
margin,
margins,
max,
mdx,
min
Saturday, March 24, 2012
Industry News
Cloud computing changing future of BI
2012-03-23
With the onset of cloud technology, many different sectors of the business and personal worlds are rapidly changing. From the focus on personal computers to mobile tablets, and from legacy systems to IaaS and SaaS, it goes without saying that in the coming years consumers can expect a technological revolution.
All these emerging systems are impacting business intelligence as well. A recent IDC study predicts that the market for big data technology and business intelligence software will grow from $3.2 billion in 2010 to $16.9 billion in 2015.
Furthermore, Gartner predicts that companies will nearly be forced into using these new technologies or risk losing a competitive edge. According to a new Gartner study, 85 percent of Fortune 500 corporations will fail to effectively use big data to get an advantage.
According to Jeff Kaplan, managing director of THINKstrategies, businesses must adopt some semblance of business intelligence and analytics software in order to maintain an edge that may have previously been established with legacy systems.
"Without all these cloud-based resources and tools, most organizations would be unable to cope with today's explosive growth of data," he said.
Labels:
azure,
bi,
Business intelligence,
Cloud,
computing,
future,
Gartner,
news,
smartphones,
sql,
tablets
Friday, March 23, 2012
The Future of Mobiles
How will the future of mobile phones be shaped by human and business demands? What is going to happen
with smartphones, and what with dumbphones? These are the questions that this embedded presentation from Business Insider aims to answer. ENJOY!
Labels:
android,
apple,
business,
cognos,
data,
etl,
ibm,
intelligence,
ipad,
iphone,
jaspersoft,
microsoft,
mobile,
opensource,
phone,
reports,
windows
Tuesday, March 20, 2012
Analytics in Sports
I am a fan of football and my favorite team is FC Barcelona. Combining sports, especially football, with analytics is just amazing.
Labels:
analysis,
analytics,
football,
game,
games,
match,
prediction,
Predictive,
soccer,
sport,
sports,
theory
Mike Walsh, futurist
Mike Walsh was invited to our VK2012 and gave a keynote about the future of technology. He insisted that the future of the world is DATA and claimed that the most important profession of the future will be Data Scientist. Let the future begin, and let me say: I am a Data Scientist.
Labels:
analysis,
analytics,
Business intelligence,
data,
future,
futurist,
math,
Mike Walsh,
model,
scientist,
technology
Thursday, March 15, 2012
Universe in your table
Microsoft Surface brings the Universe to your desk
BI for Customers
BI for everyone. Does it sound familiar?
It is a fact that Business Intelligence used to be dedicated to big companies and enterprises, because they have the amount of data considered interesting for analytics and BI. Now Gartner has started promoting the idea of BI for mid-size and small businesses, since, based on BI surveys, they need attention too. But have you ever thought of a BI solution at the customer level? Or, in more detail, do you think you could handle a personal BI solution?
ELA will give you the answer.
ELA solution for Customer Intelligence
What is ELA, actually?
Elegant Analytics is the name of a general BI solution, or a group of methodologies in Analytics, that adapts to every business profile. In this case, ELA will provide a solution for personal finance and the planning of your budget. The name of the product is PFI (Personal Finance Intelligence). Inspired by the TV show “Luksusfellen” here in Norway, this BI end-user tool may be a solution for all those who fail to maintain their own economy well, and for those who want to improve it. The purpose of this project is to create a Customer Analytical Cube that processes data for each bank customer, using his/her history for his/her own benefit, and then answers the most important queries that users run against their own data.
The solution will also include benchmarking against an imaginary subject (Ola Nordman) that can be the Min, Max or Avg of the customers' measures in a certain region, for a period of time, and for similar age group, sex and income levels.
To give you more control and help you plan your own economy, there will be an extra parameter, Target: users (bank customers) will set their targets for costs and income a month, quarter or a year ahead, and will always be warned when they are about to reach the amount they targeted.
If you want to read more then follow the link where you can download the full project.
Labels:
analytics,
Bank,
banks,
business,
Customer,
economy,
fellen,
finance,
intelligence,
Luksus,
luksusfellen,
model,
personal,
Predictive,
problems,
trouble
Tuesday, March 13, 2012
Picture worth 1000 words
Labels:
analysis,
analytics,
business,
Business Intelligence Development and Maintenance,
data,
decision,
demo,
information,
intelligence,
managment,
math,
mdx,
measures
Monday, March 12, 2012
Another approach to Time Intelligence
Best Practices for a Timescale solution
This is an original method that I use when I build SSAS Cubes, and it is time to share it with you.
Time Intelligence is a common issue for every OLAP structure, because Time as a dimension appears in every OLAP project and in every Cube you build, regardless of business model or type. To handle Time Intelligence well in calculations, aggregations and optimization, you need to use a Timescale as well. By Timescale I mean: MonthToDate (MTD), YearToDate (YTD), LastYear (LY), etc., all of them very important to everyday use of Business Intelligence solutions.
Now I will take you through the technical steps to implement this clever way of handling Timescale and time calculations.
First you create a table for Timescale in the source (in your DB, DWH or Data Mart), with 3-4 columns and 3-4 records, for example. Here is a sample of that:
Based on this table you create a Dimension Timescale where Columns are Attributes and Records are Members of that Dimension.
After that, I go to the DSV of our Cube and, on every Fact Table that needs Timescale (usually all of them do), I add a Named Calculation FK_Timescale with the value 'PE', as in the image above:
I create a relationship from FK_Timescale in the Fact Table to the Timescale table in the Data Source View (DSV), and after I build the Cube I do the same in Dimension Usage, where I create a Regular relationship between the Fact Table and the Timescale Dimension, as shown above:
I create two assignments, for MTD and YTD, right after CALCULATE; the MDX for that is shown below:
CALCULATE;
-- Period to date
-- Month to Date
[Timescale].[Timescale].[MTD] = Sum(MTD([Time Dim].[Hierarchy].CurrentMember),[Timescale].[Timescale].[PE]);
-- Year to Date
[Timescale].[Timescale].[YTD] = Sum(YTD([Time Dim].[Hierarchy].CurrentMember),[Timescale].[Timescale].[PE]);
Now you have Timescale implemented in the Cube for all your Measures, so you do not need to calculate timescales for each Measure: instead of having an MTD (or YTD) Revenue measure, you just use the Revenue measure and change the Timescale from PE to MTD (or YTD). Test this in Excel, through a data connection to the OLAP Cube, and enjoy the possibilities. This approach has proven more dynamic, flexible and better optimized for query performance.
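To illustrate the end result, a query along the following lines returns the plain, month-to-date and year-to-date figures of the same measure side by side. This is a sketch: the cube name [Your Cube], the [Date] level name and the [Revenue] measure are assumptions based on the examples above, so adjust them to your own model:

```mdx
// One measure, three timescales: no MTD/YTD copies of each measure needed
SELECT
{
[Timescale].[Timescale].[PE]
,[Timescale].[Timescale].[MTD]
,[Timescale].[Timescale].[YTD]
} ON 0
,[Time Dim].[Hierarchy].[Date].MEMBERS ON 1
FROM [Your Cube]
WHERE [Measures].[Revenue];
```

Swapping [Revenue] for any other measure gives you its MTD and YTD for free, which is the whole point of the technique.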
I would be very pleased, and it would help me keep posting good things about BI, if you could find time in your busy schedule to suggest, critique or share this blog or this particular post.
Regards,
Besim
Labels:
Business intelligence,
Calculated member,
Calculation,
dimension,
dimensions,
excel,
implementation,
intelligence,
mdx,
measures,
Named Set,
office,
pivot,
sets,
sharepoint,
ssas,
Time,
Timescale
Friday, March 9, 2012
TS: Microsoft SQL Server 2008, Business Intelligence Development and Maintenance
Skills Being Measured
This exam measures your ability to accomplish the technical tasks listed below. The percentages indicate the relative weight of each major topic area on the exam. The higher the percentage, the more questions you are likely to see on that content area on the exam. The information after “This objective may include but is not limited to” is intended to further define or scope the objective by describing the types of skills and topics that may be tested for the objective. However, it is not an exhaustive list of skills and topics that could be included on the exam for a given skill area. You may be tested on other skills and topics related to the objective that are not explicitly listed here.
Implementing an SSIS Solution (17%)
•Implement control flow. This objective may include but is not limited to: checkpoints; debug control flow; transactions; implement the appropriate control flow task to solve a problem; data profiling and quality
•Implement data flow. This objective may include but is not limited to: debug data flow; implement the appropriate data flow components
•Implement dynamic package behavior by using property expressions.
•Implement package logic by using variables. This objective may include but is not limited to: system variables; user variables; variable scope
•Implement package configurations.
•Implement auditing, logging, and event handling. This objective may include but is not limited to: use system variables for auditing; use event handlers; propagate events; use log providers; data profiling
•Extend SSIS packages by using .NET code.
This objective may include but is not limited to: use the script task; use the script component; use custom assemblies
Configuring, Deploying, and Maintaining SSIS (15%)
•Install and maintain SSIS components. This objective may include but is not limited to: implement disaster recovery for SSIS
•Deploy an SSIS solution. This objective may include but is not limited to: deploy SSIS packages by using DTUTIL; deploy SSIS packages by using the deployment utility; deploy SSIS packages to SQL or file system locations
•Manage SSIS package execution. This objective may include but is not limited to: schedule package execution by using SQL Server Agent; execute packages by using DTEXEC; execute packages by using SQL Server Management Studio; execute packages by using the SSIS .NET API
•Configure SSIS security settings. This objective may include but is not limited to: MSDB database roles; package protection levels
•Identify and resolve issues related to SSIS solution deployment.
This objective may include but is not limited to: validate deployed packages; deploy packages and dependencies between servers
Implementing an SSAS Solution (21%)
•Implement dimensions in a cube. This objective may include but is not limited to: translations; attribute relations; hierarchies
•Implement measures in a cube. This objective may include but is not limited to: measure groups
•Implement a data source view. This objective may include but is not limited to: named calculations; named queries
•Configure dimension usage in a cube. This objective may include but is not limited to: implement reference dimensions; implement many to many relationships; implement fact relationships; implement role-playing relationships; define granularity
•Implement custom logic in a cube by using MDX. This objective may include but is not limited to: actions; key performance indicators (KPI); calculated members; calculations
•Implement data mining.
This objective may include but is not limited to: implement data mining structures and models; query data mining structures by using DMX; data mining views
•Implement storage design in a cube. This objective may include but is not limited to: aggregations; partitions; storage modes; proactive caching
Configuring, Deploying, and Maintaining SSAS (17%)
•Configure permissions and roles in SSAS. This objective may include but is not limited to: server roles; SSAS database roles; cube roles; enable client application access; implement custom access to data
•Deploy SSAS databases and objects. This objective may include but is not limited to: Deployment Wizard; BIDS; SSMS; SSIS Analysis Services Execute DDL task
•Install and maintain an SSAS instance. This objective may include but is not limited to: disaster recovery
•Diagnose and resolve performance issues. This objective may include but is not limited to: use SQL Profiler; performance monitor counters; DMVs; Usage Based Optimization Wizard
•Implement processing options.
Implementing an SSRS Solution (17%)
•Implement report data sources and datasets. This objective may include but is not limited to: query types; dynamic data sources; filter location (dataset vs. query)
•Implement a report layout.
This objective may include but is not limited to: apply conditional formatting; page configuration; headers and footers
•Extend an SSRS solution by using code. This objective may include but is not limited to: custom .NET assembly; private code
•Create an SSRS report by using an SSAS data source. This objective may include but is not limited to: MDX in an SSRS report; DMX in an SSRS report
•Implement report parameters. This objective may include but is not limited to: databound parameters; multi-value parameters
•Implement interactivity in a report. This objective may include but is not limited to: drilldown; drillthrough; interactive sorting
•Implement report items.
This objective may include but is not limited to: matrix; table; chart; image; list; grouping
•Embed SSRS reports in custom applications. This objective may include but is not limited to: use the Windows Forms Report Viewer; use the Web Forms Report Viewer; use the SSRS Web service
Configuring, Deploying, and Maintaining SSRS (13%)
•Configure report execution and delivery. This objective may include but is not limited to: subscriptions; report caching; schedules; snapshot history
•Install and configure SSRS instances. This objective may include but is not limited to: deploy an SSRS Web farm
•Configure authentication and authorization for a reporting solution. This objective may include but is not limited to: configure server-level and item-level role-based security; configure Windows authentication and custom authentication
•Deploy an SSRS solution. This objective may include but is not limited to: RS.exe scripts; Report Builder; BIDS
•Configure SSRS availability. This objective may include but is not limited to: key management; migrate SSRS databases
Labels:
70-448,
besim,
Business Intelligence Development and Maintenance,
certification,
ismaili,
mcp,
microsoft,
TS: Microsoft SQL Server 2008
BIandIT.com, a website that is going to have all the important things in BI
My website is under construction and is going to have the hottest topics and trends in Business Intelligence.
You will find elegant solutions in Business Intelligence: special calculated members, Time Intelligence, Customer Intelligence, Competition Intelligence, Market Intelligence, Spatial Intelligence and Predictive Analytics. The site will also have Office templates for business start-ups, market analysis, ROI, first-year costs, etc.
www.biandit.com is coming SOON :)
Labels:
best practices,
BIDS,
business,
Business intelligence,
data,
dmx,
mathematical,
mdx,
microsoft,
mining,
model,
ms sql,
predictive analytics,
ssas,
web
How do I do Predictive Analytics without Data Mining?
This hot topic is coming soon.
Let me briefly describe what this topic will include.
It is going to show how to do Predictive Analytics on your data without using Data Mining or DMX, using only MDX: how Data Mining prediction algorithms can be "translated" from DMX to MDX, how accurate they are, and what the benefit of using MDX is.
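As a taste of what is possible, MDX already ships with basic linear-regression functions. Here is a hedged sketch (the measure and hierarchy names are taken from the Adventure Works examples elsewhere on this blog; the forecast member is an illustration, not the method the upcoming post will describe):

```mdx
// Forecast sales by fitting a least-squares line through the
// previous 12 months and evaluating it at the current month
WITH
MEMBER [Measures].SalesForecast AS
LinRegPoint
(
Rank([Date].[Calendar].CurrentMember
,[Date].[Calendar].CurrentMember.Level.MEMBERS)
,LastPeriods(12, [Date].[Calendar].CurrentMember.PrevMember)
,[Measures].[Sales Amount]
,Rank([Date].[Calendar].CurrentMember
,[Date].[Calendar].CurrentMember.Level.MEMBERS)
)
SELECT
{ [Measures].[Sales Amount], [Measures].SalesForecast } ON 0
,[Date].[Calendar].[Month].MEMBERS ON 1
FROM [Adventure Works];
```

LinRegPoint fits a straight line through the trailing 12-month window and projects it onto the current member's position, which is about the simplest possible "prediction without Data Mining".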
Labels:
analytics,
BIDS,
Business intelligence,
data,
decision,
dmx,
forecasting,
intelligence,
managment,
mdx,
mining,
model,
Predictive,
ssas,
statistics
Wednesday, March 7, 2012
MDX Studio Online
Mosha Pasumansky, father of OLAP and MDX
Mosha Pasumansky is one of the inventors of the MultiDimensional eXpressions (MDX) language, a query language for online analytical processing (OLAP) databases. Pasumansky is also one of the architects of the Microsoft Analysis Services, and an OLAP expert. Mosha Pasumansky is well known in the OLAP community for his Microsoft OLAP information website which contains a collection of technical articles and other resources related to Microsoft OLAP and Analysis Services. He also has a blog dedicated to MDX and Analysis Services. He spoke at Microsoft conferences such as TechEd and PASS, and he published the book Fast Track to MDX. As of 29 December 2009, Mr. Pasumansky had shifted his focus[1] to Bing, the Microsoft Search Engine, and is no longer maintaining his active stewardship of the BI Community. We are going to miss him and his articles regarding OLAP, MDX and Business Intelligence in general.
Source Wikipedia
This is an online version of the MDX Studio product built by Mosha. The full version can be downloaded from http://www.mosha.com/msolap/mdxstudio.htm. For discussion, bug reports, feature suggestions etc., please visit the blog here. Here is the link to MDX Studio Online: http://mdx.mosha.com/default.aspx