Feedback requested regarding my current approach | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

Feedback requested regarding my current approach

Hi, I would like to request feedback from experienced users on whether the approach I am using for my "BI" project is on the right track or not. Here’s how I am doing the various steps:

1) In the OLTP database, I have defined views which "make available" the data that needs to be brought into my data mart.
2) On each run, my DTS packages (which bring the data from the OLTP into my data mart) drop and re-create the tables and then populate them using the views mentioned above. The reason for dropping and re-creating the tables is that in the early days (when I started this learning process) I was having problems – I can’t recall the details, but they were related to my time dimension and to figuring out what data to append and what not to append.
3) In my data mart, I have created views on which the various dimensions and fact tables are based.
4) I am doing a FULL process of dimensions and cubes rather than incremental, in the hope that this is the right approach and will ensure that everything goes right.
5) I have used MOLAP all across. Should I consider other options if performance is the key criterion?
6) All my dimensions are shared dimensions.
7) I have set the aggregation performance setting to 100%.

I would appreciate some comments from the experts on the various stages listed above and on how I am going about each. I would also appreciate it if you can share your recommendations on alternatives based on your experience. Many thanks in advance.
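As an illustration of step 1, a view in the OLTP database might look something like this minimal sketch. All object and column names here (dbo.Orders, dbo.OrderLines, etc.) are invented for the example and are not taken from the actual system:

-- Hypothetical OLTP view exposing the rows the mart load needs
CREATE VIEW dbo.vw_MartSalesExtract
AS
SELECT  o.OrderID,
        o.OrderDate,
        ol.ProductID,
        ol.Quantity,
        ol.Quantity * ol.UnitPrice AS LineAmount
FROM    dbo.Orders AS o
        JOIN dbo.OrderLines AS ol ON ol.OrderID = o.OrderID
WHERE   o.OrderStatus = 'Complete'
GO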
I’m no BI brainbox, but I do look after a mart, and i’d say (before Raulie corrects me!!) 1. Fine
2. I do truncate / drop indexes / DTS in / re-create indexes (see the sketch after this post) – no real need to drop the tables, but not much difference really
3. good
4. Yes – incremental cube-process seems to be too complex and scary!
5. go MOLAP, yes – only reconsider IF your load/process times get too long. MOLAP will mean faster cube queries..
6. yep
7. You have to trial & error this, I think. Hope this helps a little bit. Tom Pullen
DBA, Oxfam GB
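Point 2 above, spelled out as a rough T-SQL sketch. The fact table and index names are hypothetical, and the actual data pump happens inside the DTS package between the two index statements:

TRUNCATE TABLE dbo.FactSales
GO
-- Drop the non-clustered indexes before the bulk load (SQL 2000-style syntax)
DROP INDEX dbo.FactSales.IX_FactSales_DateKey
GO
-- ... DTS package pumps the rows from the OLTP views into dbo.FactSales here ...
-- Re-create the indexes once the load has finished
CREATE INDEX IX_FactSales_DateKey ON dbo.FactSales (DateKey)
GO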
Hi Tom, Many thanks for your feedback. This is REALLY very encouraging for me because I started all of this without any training (but with great help from forums) and it is good to know that I got started on the right track. Long, long way to go, but this indeed gives me some confidence… Regarding "before Raulie corrects you": yes, I would agree with you, because based on my observation so far, Raulie seems to be really very experienced (kudos) and has been of GREAT help so far… Thanks again Tom.
Joozh, m’learned colleague has been discussing item 7 with me and he says go for 20% – no need to go for 100%, it’ll take far too long to design the aggregations. We use about 10% or so and performance is fine because of caching etc. Tom Pullen
DBA, Oxfam GB
Thanks Tom, Here I go again with another question/botheration: Can you help me set up caching so that 10 or 20% will give almost the same results? If the solution calls for both hardware and software aspects, then I will really appreciate it if you can help me with the details….
I don’t think you need to DO anything; A.S. is quite intelligent at sorting out caching itself. Because there is a lot of overlap, i.e. a lot of your queries are similar or ask similar questions, A.S. can satisfy them rapidly. Tom Pullen
DBA, Oxfam GB
Thomas, your replies to all 7 are on point. [8D] Joozh, on point 7 I just wanted to add some comments. I think 100% might be too high. Like Thomas mentioned, test performance and choose a percentage that fits; setting too high a value will just take up unnecessary disk space and slow processing time. Go with a lower amount, say 20%, and if you see your queries have slowed down, then work your way up. Remember that aggregations are stored at the lower levels in your dimensions, so in effect they can roll up to higher levels on demand (see the example after this post). By the way, thanks guys for your comments [:I] I really enjoy working in the Business Intelligence space and sharing my knowledge with others.
Raulie
Hewlett-Packard Company
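To illustrate Raulie’s roll-up point with a made-up example: if the cube already stores an aggregation at the Month level, a Year total only has to add up 12 pre-aggregated values instead of re-scanning the whole fact table. In relational terms it is roughly the difference between these two queries (all table names are invented):

-- Without the aggregation: sum every fact row for each year
SELECT  d.CalendarYear, SUM(f.SalesAmount) AS YearSales
FROM    dbo.FactSales AS f
        JOIN dbo.DimDate AS d ON d.DateKey = f.DateKey
GROUP BY d.CalendarYear

-- With a month-level aggregation: just roll up 12 stored subtotals per year
SELECT  CalendarYear, SUM(MonthSalesAmount) AS YearSales
FROM    dbo.AggSalesByMonth
GROUP BY CalendarYear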
Hi Tom and Raulie,

Thank you for your replies and feedback :)

If you don’t mind me asking: how long have you been working with DW, OLAP, BI and all this stuff? I have to say that I am impressed with your knowledge and would just like to get a rough idea of when I will come across at least a rest area in this very, very long journey :) Will appreciate your reply.
Joozh, our DW went live in early 2003. It was coded by a contractor, but support for it is with me and a permanent developer. We are just embarking on a project to implement incremental loads for it, as the load time is getting very long (due to the complexity of the source data, it was initially designed to be loaded in its entirety every night – but it has gradually slowed down with data volume growth). So the short answer is – just under 2 years. Tom Pullen
DBA, Oxfam GB
Thanks Tom. You know, information such as this also helps and lets a person (like me) see where I stand, get a rough idea of whether the learning/progress has been okay or bad, how long it takes for others, and how much further down the road the 1st "rest area" is….. Ahhhhhhh :)

BTW, what front-end are you using, and what have your experiences with it been?

Thanks again Tom.
P.S. I hope Raulie can also share some experience.
We use Excel XP (PivotTable Service). It works a treat. Business Analysts also query the mart’s SQL Server tables directly using SQL Server tools (mainly QA) – something along the lines of the query after this post. Glad to be of help, and good luck on your learning curve. There is a rest area in sight, 1.5 days to go, it’s called the weekend! Yippee!! Tom Pullen
DBA, Oxfam GB
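For anyone curious what those ad-hoc mart queries in Query Analyzer tend to look like, something along these lines is typical (again, the star-schema table and column names are purely hypothetical):

SELECT  d.CalendarYear,
        p.ProductCategory,
        SUM(f.SalesAmount) AS TotalSales
FROM    dbo.FactSales AS f
        JOIN dbo.DimDate AS d    ON d.DateKey = f.DateKey
        JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
GROUP BY d.CalendarYear, p.ProductCategory
ORDER BY d.CalendarYear, p.ProductCategory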
Thanks Tom. And thanks for helping me clear my eyesight. Oh yesss, I see it too (the upcoming rest area) :)
As for me, I moved from Production DBA to full-time BI Developer not too long ago; however, I have had prior experience working in the OLAP space on other projects. Raulie
Hewlett-Packard Company
Thanks Raulie.