Thursday, November 20, 2008

Business Intelligence 2.0

Business Intelligence, which consists of Dashboard Systems and Decision Support Systems, has grown up over a long time. Most of the systems that BI developers have built belong to Dashboard Systems, which are used for presenting KPIs and revealing what has happened in the past. Decision Support Systems, whose implementation depends heavily on Data Mining, have been developed much less than Dashboard Systems.

BI 2.0 not only focuses on these systems, but it also changes the way we are used to doing ETL. The rise of more service-oriented applications, and the fact that data sources themselves are becoming service-oriented, will change the way we gather and clean data. Moreover, the need to take into account the Business Processes and Business Rules that live in BPEL and BRMS when performing the ETL process will change this industry a lot.

I think the next generation of BI faces a lot of challenges which are very interesting to study.

Sunday, October 26, 2008


Designing appropriate aggregations in SQL Server Analysis Services is a big issue that a BI developer must consider carefully. When I read the Performance Optimization chapter of Professional SQL Server Analysis Services 2005 with MDX, I felt that I should study aggregation design in more detail if I wanted to get good performance on an Enterprise Data Warehouse.

Although the "Microsoft SQL Server 2005 Analysis Services Performance Guide" was published in February 2007, I only found it today! It describes some important tips in detail.

If you haven't read it, don't miss it.

Tuesday, September 9, 2008

Change data capture

Identifying and extracting changed data from source tables used to take a lot of effort; it also required designing the databases in an appropriate manner to keep the changed data within tables or log files. The concept of “Change Data Capture”, a set of software design patterns used to determine (and track) the data that has changed in a database, helps designers overcome the difficulty of solving such problems.

Fortunately, SQL Server 2008 implements a mechanism that enables DBAs to automate this process, similar to the “Capturing Change” feature we could already find in Oracle 9i.
By using this new feature, we can specify a log table in which changes to the data are tracked. These changes are inserts, updates, and deletes. SQL Server 2008 handles the operations required to store the appropriate data.
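To make the idea concrete, here is a minimal sketch of the mechanism using SQLite triggers from Python. This is only an illustration of the pattern: SQL Server 2008 does this natively (you enable it per table rather than writing triggers yourself), and the table and column names below are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);

-- The log table that tracks what changed, and how.
CREATE TABLE customer_cdc (
    change_id   INTEGER PRIMARY KEY AUTOINCREMENT,
    operation   TEXT,        -- 'I', 'U', or 'D'
    customer_id INTEGER,
    name        TEXT
);

CREATE TRIGGER customer_ins AFTER INSERT ON customer
BEGIN
    INSERT INTO customer_cdc (operation, customer_id, name)
    VALUES ('I', NEW.id, NEW.name);
END;

CREATE TRIGGER customer_upd AFTER UPDATE ON customer
BEGIN
    INSERT INTO customer_cdc (operation, customer_id, name)
    VALUES ('U', NEW.id, NEW.name);
END;

CREATE TRIGGER customer_del AFTER DELETE ON customer
BEGIN
    INSERT INTO customer_cdc (operation, customer_id, name)
    VALUES ('D', OLD.id, OLD.name);
END;
""")

conn.execute("INSERT INTO customer VALUES (1, 'Ali')")
conn.execute("UPDATE customer SET name = 'Amin' WHERE id = 1")
conn.execute("DELETE FROM customer WHERE id = 1")

changes = conn.execute(
    "SELECT operation, customer_id, name FROM customer_cdc ORDER BY change_id"
).fetchall()
print(changes)  # [('I', 1, 'Ali'), ('U', 1, 'Amin'), ('D', 1, 'Amin')]
```

An ETL process can then read only the rows of the log table that appeared since its last run, instead of scanning the whole source table.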

The following links are very helpful for understanding this new feature, which can make the ETL process much easier.

Monday, August 11, 2008

One Year Blogging Anniversary

It was last August that I started writing this blog. I started blogging just to make notes on all the things I learn about Business Intelligence. I think it is a good idea to write about what we are learning and doing, because it makes it possible for us to judge ourselves and compare each year's efforts with the others.
Although it played the role of a technical diary for me, it also led me to get to know a lot of friends, such as Pedro, Ella, and others. It was definitely the best experience I have had, because I could find friends, expand my knowledge, and consider others' opinions and roadmaps.
I want to apply for a master's degree this year, and I hope that I can find a proper program. I have taken the IELTS, and I am currently searching for universities while contacting their teachers to learn more about their courses. Hard work and all these activities make writing a bit difficult for me, but I will try to continue blogging, especially now that SQL Server 2008 has been released ;)
I am very eager to study accounting, cash flow, budgeting, ERP, supply chain management, and so on, because I think an in-depth knowledge of these topics could help me analyze KPIs better.

Thank you

Best wishes,
Amin Jalali

Friday, July 25, 2008

Microsoft DreamSpark

Microsoft created this site for students, so that they can access the professional developer and designer tools at no charge. It is very helpful that you can download and use software like "Microsoft Visual Studio 2008 Professional Edition", "Microsoft SQL Server 2005 Developer Edition", and so on at no cost.

Sunday, April 13, 2008

Market Basket Analysis

Some time ago, one of the Accounting Division’s staff asked me to reveal whether the Bank’s last decision about introducing new participation papers was successful or not; he also asked me to specify what percentage of its sales came from customers switching out of other deposits.
It was a hard job, because we did not have any clear idea of how to deal with this matter! It got done, but I kept thinking about solving this problem via Business Intelligence solutions rather than executing complex queries on databases.
When I was reading “The Data Warehouse Toolkit, Second Edition”, I realized that it is very similar to the “Market Basket Analysis” concept, and after doing some searching it began to sink in.
Indeed, I found out that it could be done via the association algorithm of Data Mining. This subject is about market basket analysis: finding associations between products based on their sales, in order to offer new discounts that increase sales.
For instance, imagine you find out that toothpaste and toothbrushes are sold together in most cases; that could be a good starting point for designing a new promotion!
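As a rough sketch of what the association algorithm measures underneath, here is the support/confidence calculation on a handful of made-up baskets. The transactions are invented toy data; real mining tools add candidate generation and pruning on top of these two measures.

```python
# Support and confidence, the two basic measures behind association rules.
transactions = [
    {"toothpaste", "toothbrush"},
    {"toothpaste", "toothbrush", "soap"},
    {"toothpaste", "soap"},
    {"toothbrush"},
    {"toothpaste", "toothbrush"},
]

def support(itemset):
    """Fraction of baskets that contain every item in `itemset`."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(antecedent, consequent):
    """How often the consequent appears in baskets that contain the antecedent."""
    return support(antecedent | consequent) / support(antecedent)

# Rule: toothpaste -> toothbrush
print(support({"toothpaste", "toothbrush"}))       # 0.6 (3 of 5 baskets)
print(confidence({"toothpaste"}, {"toothbrush"}))  # ~0.75 (3 of 4 toothpaste baskets)
```

A rule with high support and high confidence, like this one, is the kind of pattern you would consider for a joint promotion.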

Saturday, March 29, 2008

Conformed Dimension

The term Enterprise Data Warehouse sounds very nice at first glance, yet when businesses estimate its cost, they usually prefer to build a few important data marts that drive their most important business goals, in order to break up its enormous cost. Besides, having these kinds of data marts can become a huge problem, especially when they cannot be integrated with each other as they are created over time.
Having a bus architecture is undoubtedly one of the vital parts of building an enterprise data warehouse over time, for it provides the possibility of joining the data marts by specifying the conformed dimensions.
Conformed dimensions are the dimensions that are common between data marts and make drilling across possible. Of course, they cannot be left out of an enterprise data warehouse's main features.
Finally, the most common problem you may encounter while building conformed dimensions is the different levels of grain that different data marts need, so it is recommended that you design the dimensions' levels correctly, which will also be helpful for drill-through actions.
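As a small illustration of drilling across through a conformed dimension, here is a sketch using SQLite from Python. The schemas and numbers are invented; the point is only that both fact tables reference the same dim_date keys, so their aggregates can be combined into one row per date.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date       (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales     (date_key INTEGER, amount REAL);
CREATE TABLE fact_shipments (date_key INTEGER, units  INTEGER);

INSERT INTO dim_date       VALUES (1, '2008-01'), (2, '2008-02');
INSERT INTO fact_sales     VALUES (1, 100.0), (1, 50.0), (2, 80.0);
INSERT INTO fact_shipments VALUES (1, 7), (2, 3), (2, 4);
""")

# Aggregate each fact table separately, then join the results
# through the conformed date dimension.
rows = conn.execute("""
    SELECT d.month, s.total_sales, sh.total_units
    FROM dim_date d
    JOIN (SELECT date_key, SUM(amount) AS total_sales
          FROM fact_sales GROUP BY date_key) s   ON s.date_key  = d.date_key
    JOIN (SELECT date_key, SUM(units)  AS total_units
          FROM fact_shipments GROUP BY date_key) sh ON sh.date_key = d.date_key
    ORDER BY d.month
""").fetchall()
print(rows)  # [('2008-01', 150.0, 7), ('2008-02', 80.0, 7)]
```

If the two marts kept their own, unconformed date tables, this join would simply not be possible.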

Saturday, March 1, 2008

Factless Fact Tables

I have recently become familiar with the concept of “Factless Fact Tables”, and since I found it very interesting, I would like to share it with you, in case you are not already familiar with it.
A “Factless Fact Table” is a table which is similar to a Fact Table except that it has no measures; I mean that this table just has the links to the dimensions. But what is the benefit?
These tables enable you to track events; indeed, they exist to record events.
Another kind of this table is known as the “Factless Coverage Table”, which is very useful. Imagine that you have retail stores and each store has its own promotion policy. It would be very complicated to answer this sort of question: “Which products were on promotion that didn't sell?”
The best way to cover these kinds of questions is coverage tables. These tables hold the coverage data needed for answering such questions.
I also recommend reading “Factless Fact Tables”, written by Ralph Kimball.
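Here is a minimal sketch of the idea, using SQLite from Python. The table layout and data are invented; in a real schema the keys would reference proper dimension tables. Note that the coverage table has no measures at all: a row simply records that a product was on promotion at a store on a given day.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE promotion_coverage (store_id INTEGER, product_id INTEGER, date_key INTEGER);
CREATE TABLE fact_sales         (store_id INTEGER, product_id INTEGER, date_key INTEGER, amount REAL);

-- Products 10, 11, and 12 were on promotion at store 1 on day 1...
INSERT INTO promotion_coverage VALUES (1, 10, 1), (1, 11, 1), (1, 12, 1);
-- ...but only products 10 and 11 actually sold.
INSERT INTO fact_sales VALUES (1, 10, 1, 9.99), (1, 11, 1, 4.50);
""")

# "Which products were on promotion that didn't sell?"
unsold = conn.execute("""
    SELECT c.product_id
    FROM promotion_coverage c
    LEFT JOIN fact_sales f
           ON f.store_id   = c.store_id
          AND f.product_id = c.product_id
          AND f.date_key   = c.date_key
    WHERE f.product_id IS NULL
""").fetchall()
print(unsold)  # [(12,)]
```

Without the coverage table there is nothing to join against: the sales fact table only records what did happen, never what could have happened but didn't.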

Wednesday, January 23, 2008

“Other” member in Data Visualization

One important and fabulous thing that I learned from the "Data Analysis Using SQL and Excel" book is how to present the correct data using the correct SQL query.
Imagine that we have an orders table which contains customers’ orders. Now we need to answer the following question:
What is the distribution of the number of orders in the 5 states that have the largest number of orders? (representing the remaining states as an "Other" category)
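In the spirit of the book's examples, here is a sketch of one way to write such a query, shown with SQLite from Python; the orders table and its contents are invented here.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, state TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [
    (i, s) for i, s in enumerate(
        ["NY"] * 6 + ["CA"] * 5 + ["TX"] * 4 + ["FL"] * 3 + ["WA"] * 2 + ["OR"]
    )
])

# Keep the top 5 states by order count; fold everything else into 'Other'.
rows = conn.execute("""
    SELECT CASE WHEN state IN (SELECT state FROM orders
                               GROUP BY state
                               ORDER BY COUNT(*) DESC LIMIT 5)
                THEN state ELSE 'Other' END AS state_group,
           COUNT(*) AS num_orders
    FROM orders
    GROUP BY state_group
    ORDER BY num_orders DESC
""").fetchall()
print(rows)  # [('NY', 6), ('CA', 5), ('TX', 4), ('FL', 3), ('WA', 2), ('Other', 1)]
```

The CASE expression maps every state outside the top 5 to the single 'Other' member before grouping, so the result always has at most six rows no matter how many states appear in the data.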



Friday, January 4, 2008

Analyzing vs. Designing

I want to describe the difference between "Data Mining with SQL Server 2005" and "Data Analysis Using SQL and Excel" in general.
This difference is derived from the distinction between analyzing and designing data mining systems.
To analyze a Data Mining system, you must use tools that allow rapid development, in order to interact with the stakeholders and users quickly. In this situation, I recommend using Excel and SQL to clarify the results for users, because at this stage there is no mining model implemented on the OLAP server yet.
But if you are designing a Data Mining system, you must have a good knowledge of the OLAP server and its development tools.
As a result, I think “Data Mining with SQL Server 2005” suits Designing, while “Data Analysis Using SQL and Excel” is appropriate for Analyzing.

I prefer to read both of these books, in order to handle both the analyst and designer roles in a BI project. Maybe it is because of the fact that I could not find anyone else who could take one of these roles.
But I prefer to read “Data Analysis Using SQL and Excel” first, and then get involved in “Data Mining with SQL Server 2005”.
I think it is worth buying “Data Analysis Using SQL and Excel”, but I do not have any opinion about the other book; it is not so important anyway, because there are no other books that describe Data Mining with SQL Server 2005 in a practical way.
Finally, I do not think these books can be used instead of each other. In fact, they must be used as complements.