SQL Server Performance

Cube Designing

Discussion in 'Analysis Services/Data Warehousing' started by sqlboom, Jul 7, 2004.

  1. sqlboom New Member


    Currently I'm working in Analysis Services. I've faced some difficulties while designing a cube. Can anybody help me by giving links to good documents related to designing a cube?

  2. SQL_Guess New Member

    I hope you find some - I'll be interested in seeing them as well!

    Panic, Chaos, Disorder ... my work here is done --unknown
  3. SQL_Guess New Member

  4. sqlboom New Member

    But I am searching for links that describe the complexities that arise at the time of designing a cube.
    Will you please help me in this matter?

  5. satya Moderator

    If you have large, complex cubes I do recommend using the 3GB memory setting. Remember that if you do upgrade to Windows 2000 Advanced Server, to configure Analysis Services to use 3GB you have to set it in the registry; Analysis Manager only allows setting up to 2GB.
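    For reference, a sketch of what that registry change might look like. The key and value names here are assumptions based on the AS 2000 registry layout, and the units are assumed to be bytes; verify both against your own documentation before applying anything:

    ```
    REM boot.ini must include the /3GB switch for user processes to address 3 GB.
    REM Then raise the Analysis Services memory ceiling beyond what the UI allows.
    REM ASSUMPTION: HighMemoryLimit under this key, in bytes (3221225472 = 3 GB).
    reg add "HKLM\SOFTWARE\Microsoft\OLAP Server\CurrentVersion" /v HighMemoryLimit /t REG_DWORD /d 3221225472 /f
    ```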

    While processing the cube, have a look under Task Manager at the msmdsrv.exe process to assess memory usage.

    Remember that Analysis Services reads all cube metadata and dimension members into memory at startup. Also, if you have dimension- or cell-level security for numerous roles, these too eat up memory, as the elements each security role allows are read into memory and remain there until a restart or reboot.

    In any case, make sure to check for the latest service pack or any hotfix for Analysis Services.

    Satya SKJ
    This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
  6. Raulie New Member

  7. satya Moderator

    Thanks Raulie, I believe the link you referred to is more relevant in terms of cube development.

    Satya SKJ
    This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
  8. Raulie New Member

    No problem. However, the poster did not mention the exact design difficulties he is facing.
  9. sqlboom New Member

    Problems are galore..

    The datamart has been in place for years.
    Cube design guidance suggests not using varchar columns as measures. I did so and ran into trouble, discovering that the distinct count aggregate function does not work for them. The obvious solution was to make this a dimension and add a calculated measure. Now, as a dimension, it can overflow the maximum member limit for a dimension level (64K).
    I have tables which talk to each other through part of a column, like
    table1.colA = substring(table2.colB, 1, 3).

    Hence the curiosity to know the design issues with AS beforehand.
  10. stevefromOZ New Member

    I think the ops guide (see links posted earlier by Raulie) indicates that using integers/numbers for keys is better than varchars/chars, which is a practice I have used since starting with Cognos and then moving to AS. I do this more for space savings and consistency with the source DB.
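    As a minimal sketch of that practice (table and column names here are hypothetical, not from the poster's datamart), the varchar business code is kept as an attribute while the cube joins on an integer surrogate key:

    ```sql
    -- Hypothetical dimension table: the cube keys on the int surrogate,
    -- while the original varchar business code survives as an attribute.
    CREATE TABLE DimProduct (
        ProductKey  int IDENTITY(1,1) PRIMARY KEY,  -- surrogate key used for joins
        ProductCode varchar(10) NOT NULL,           -- original business key
        ProductName varchar(50) NOT NULL
    );
    ```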

    The 64K member issue can be overcome: either manually make levels between the parents that have 64K children, or use the built-in mechanism that does this for you but hides the extra levels from the end users --> look up 'Creating Member Groups' in Books Online.

    I would always go for making those joins in views over the DB rather than asking AS to do them, or even better, re-model the DB to have nicer joins.
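    A minimal sketch of such a view, using the substring join from the earlier post (the view and alias names are made up for illustration):

    ```sql
    -- Hypothetical view that computes the partial-column key in the database,
    -- so Analysis Services sees a plain equi-join on a whole column
    -- (vFact.DimKey = table1.colA) instead of a substring expression.
    CREATE VIEW vFact
    AS
    SELECT
        f.*,                                -- fact columns from table2
        SUBSTRING(f.colB, 1, 3) AS DimKey   -- derived key matching table1.colA
    FROM table2 AS f;
    ```

    The cube's fact table is then pointed at vFact rather than table2, and the dimension join becomes an ordinary column-to-column equality.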

    There are a couple of books around, but I've not found one yet that would clearly answer your questions. Having said that, the Ops Guide (on the MS website) does offer some good hints on how best to implement your AS cubes. Also, not sure if I posted it here earlier, but there is a good web presentation by Dave Wickert that gives his top 10 tips for AS.
