Adaptive BI: Engineering a BI Solution


Are you starting to create a BI solution... but where do you begin? How do you set everything up correctly so that you can handle new features requested by customers easily and without too much effort? Which standards do you put in place so that everyone who joins the project is immediately productive? How do you define your BI architecture so that it is sound and flexible at the same time? All these questions come up every time someone starts a BI project. In this session I'll present a set of standard rules we use to create our BI projects, consolidated over more than five years of development (since SQL Server 2005 was released), that put in place the technical conventions that help you start the right way from the very beginning: naming conventions, architecture decisions, database unit testing, layering, and everything else you need to know to build an "Adaptive" Business Intelligence solution. The idea is to be flexible in terms of architectural decisions, but to have some well-known rules in the solution that make it "engineerable": in one word... "Adaptive"!

Comments on this Talk

suhail ali, 20 Sep 07:03 PM

Great architecture concept of separating concerns while building the DW, one that often gets overlooked, and it was presented well. One area for improvement would be not to position the slides on schema and naming standards as a necessary evil on the way to the demo; the demo merely illustrates how easy it is to automate and maintain the solution once the rules are in place. Keep the presentation focused on the architecture concepts, such as separation by schema and by database, by discussing the role and responsibility of each schema in detail. Implementation details, such as naming standards or best practices like minimizing stored procedure calls, can be a link to a document where attendees can read up on them later. The audience will forget the implementation details anyway, so they work better as a reference document. Interweaving the demo with the slide deck would help visualize the concepts as the presentation proceeds, so there is no rush at the end to get through the material; that is where you could touch on implementation best practices instead of putting them on slides.

An interesting and useful automation would be to run rule checks against, say, the staging or DW package solution to find out where rules are not being implemented correctly: queries not going to the right schema, or naming standards not being followed. Users could add their own rules as well.
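The rule-check automation suggested above can be sketched in a few lines. Note that the schema names and prefix rules below are hypothetical stand-ins for whatever naming standard a project actually adopts; the session's own conventions may differ:

```python
# Minimal sketch of an automated naming-rule checker for DW objects.
# The schemas and prefixes here are illustrative assumptions, not the
# standards from the session.
RULES = {
    "etl": "ETL_",   # objects in the etl schema must start with ETL_
    "stg": "STG_",   # staging objects must start with STG_
    "tmp": "TMP_",   # temporary/working objects must start with TMP_
}

def check_naming(objects):
    """Return a list of (schema, name, reason) violations.

    `objects` is an iterable of (schema, name) pairs, e.g. pulled from
    sys.objects or from an inventory of SSIS package targets.
    """
    violations = []
    for schema, name in objects:
        prefix = RULES.get(schema)
        if prefix is None:
            violations.append((schema, name, "unknown schema"))
        elif not name.startswith(prefix):
            violations.append((schema, name, f"expected prefix {prefix}"))
    return violations

if __name__ == "__main__":
    inventory = [
        ("stg", "STG_Customer"),
        ("stg", "Customer"),      # violation: missing STG_ prefix
        ("dwh", "DimCustomer"),   # violation: schema not covered by a rule
    ]
    for violation in check_naming(inventory):
        print(violation)
```

Because `RULES` is just a dictionary, users can register their own rules, as the comment suggests; the same loop could also walk the SQL statements inside a `.dtsx` package to flag queries that touch the wrong schema.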

I'm looking forward to the dimension package generator being released in November. Hopefully a fact package generator will follow soon afterward.

Judy Flora, 20 Sep 07:04 PM

Davide, I thought your presentation was great.
Somebody at SQLPass (not you) should have been monitoring the feedback and muting the microphones of people talking. That was a bit annoying, but not your fault. I do hope that I will be able to access the recording of your presentation AND the PowerPoint slides from the website.

Unfortunately I am not able to attend the SQLPass conference to hear your extended presentation of this material. It would be helpful to see more details of the SSIS packages.

When I have time, I would like to listen again to your recording and perhaps I will have more specific questions and comments at that time.

Thanks very much. Judy Flora Workers' Compensation Board of Nova Scotia Halifax, Nova Scotia, Canada

(I am an American living in Canada, but my father's parents were from Naples and Sicily, and I hope to visit your country someday!)

suhail ali, 27 Sep 04:18 PM

I've just reviewed the slides and there are a few areas that could be clarified, especially since we have implemented many of your recommended best practices:

  1. Does the staging database keep history, such that the DW can be re-loaded from the staging database? If the answer is yes, then creating a fact snapshot table would require creating a snapshot of the source data in the staging schema as well. This may also mean that files being loaded into the ETL schema would need their history kept in the database, rather than archiving the files, for ease of access during reloads. This also raises an unrelated question: why does data coming from another database land in the staging schema, while data coming from a file lands in the ETL schema? Shouldn't all data land in the same schema?

  2. Do the rules allow cross-database queries between the staging database and the DW database? If so, anything loaded into the tmp schema in the staging database might be merged into a dimension in the DW. Does this mean the two databases, for all practical purposes, must reside on the same database server?

  3. I'm sure this is a feature for a future release of the dimension generator, but handling multi-valued dimensions is a bit involved, and the generator could help in this area. By multi-valued dimension I mean the case where a bridge table is necessary to map a dimension to a fact.

  4. I know the MERGE statement is far superior in performance to the SCD component in SSIS, but I'm not convinced that for very large dimensions (200K+ rows) MERGE scales as well as batched updates. I believe Andy Leonard alluded to this in his last webcast, but he too didn't have any hard evidence at this point. Anyway, something to keep in mind.

  5. I can appreciate the benefit of having prepare packages land data in the database when doing complex calculations, both to aid debugging and to break packages down to a single purpose, but could the same goal be achieved by caching the data in SSIS and using data viewers for debugging? I ask because you typically want to do most of your transformations in memory to achieve high-performance ETL, rather than relying on landing the data in tables. I would rather have the rule be to do all ETL in memory, and then break the rule in the special cases where maintainability is more important than performance.

  6. If you need someone to test the dimension generator before PASS, I'll be happy to test it for you!
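The MERGE-versus-batched-updates trade-off in point 4, and the generator idea behind points 3 and 6, can be illustrated with a small metadata-driven sketch. Everything here is an assumption for illustration: the table and column names are hypothetical, this is not the session's actual generator, and the code only emits the two T-SQL shapes being compared; it does not measure their performance:

```python
# Sketch of a metadata-driven generator emitting two alternative T-SQL
# loads for a Type 1 dimension: a single MERGE, and a batched UPDATE
# that would be re-run in a loop until @@ROWCOUNT reaches zero.
# Table and column names are hypothetical examples.

def merge_statement(dim, stage, key, cols):
    """Emit one MERGE covering both the update and insert paths."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE {dim} AS t\n"
        f"USING {stage} AS s ON t.{key} = s.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list});"
    )

def batched_update(dim, stage, key, cols, batch_size=50000):
    """Emit the batched-update alternative (one batch per execution)."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    return (
        f"UPDATE TOP ({batch_size}) t SET {set_clause}\n"
        f"FROM {dim} t JOIN {stage} s ON t.{key} = s.{key};"
    )

if __name__ == "__main__":
    args = ("dbo.DimCustomer", "stg.STG_Customer", "CustomerKey",
            ["Name", "City"])
    print(merge_statement(*args))
    print(batched_update(*args))
```

Generating both shapes from the same column metadata is one way a dimension generator could make the comparison cheap to benchmark on a real 200K+ row dimension, instead of arguing about it in the abstract.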

5 Ratings: 4.03

Delivery: 3.84

Content: 4.22

Time & Location

September 20, 2010 — 12:00 PM
LiveMeeting