Limitations of AS2005


pras

Aug 22, 2005, 1:43:34 AM
Hi,

Has anyone working with Yukon (AS2005) felt any limitations of
AS2005 compared to the features in AS2000?

Regards,
Prasanna

Chris Webb

Aug 22, 2005, 8:31:58 AM
I think it's still too early to say: some functionality has changed quite
significantly since the June CTP and is still changing, so any criticisms made
now might not be valid at RTM. Also, it's very easy to think something's
worse just because it forces you to work in a different way from how you're
used to: for example, when I first started using Business Intelligence
Development Studio (with its need to deploy changes after you've made them) I
used to prefer Analysis Manager, but now that I'm used to it I wouldn't go back.

Chris

--
Blog at:
http://spaces.msn.com/members/cwebbbi/

Mosha Pasumansky [MS]

Aug 22, 2005, 2:28:26 PM
> used to: for example, when I first started using Business Intelligence
> Development Studio (with its need to deploy changes after you've made them)
> I used to prefer Analysis Manager, but now that I'm used to it I wouldn't
> go back.

Only true when you work in Project mode. I personally prefer to work
in Online mode, where any changes you make are saved directly to the
server; this is a closer model to what Analysis Manager provided.

--
====================================================
Mosha Pasumansky - http://www.mosha.com/msolap
Analysis Services blog at http://www.sqljunkies.com/WebLog/mosha
Development Lead in the Analysis Server team
All you need is love (John Lennon)
Disclaimer: This posting is provided "AS IS" with no warranties, and
confers no rights.
====================================================


pras

Aug 23, 2005, 12:51:40 AM
Hi,
Thanks for the response.
Yes, AS2005 has very good features compared to AS2000.
I have some queries below that need to be answered.

1. Since AS2005 doesn't support DSO, if I need to create dynamic
partitions, how should I go about it?
2. Earlier, a cube could be archived with up to 2 GB of data so
anyone could analyze it. Now with AS2005 one has to deploy first and
then process the data, and only then can one analyze it.
3. I am not fully convinced about proactive caching. What advantages
do you get from it for the cubes in hand? Is it only for processing
the data, or will it help in querying also?
4. For the browser that ships with BI Studio, up to what data volume
does it work well?

I may still have some more queries, which I will post later.

Regards,
Prasanna

Mosha Pasumansky [MS]

Aug 23, 2005, 4:02:25 AM
> 1. Since AS2005 doesn't support DSO, if I need to create dynamic
> partitions, how should I go about it?

Actually, AS2005 still supports DSO (in the form of DSO9), but this is for
backward-compatibility purposes, and it will only work on migrated cubes. The
replacement for DSO is called AMO, and it is a managed object model.
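AMO itself is a .NET object model, so the actual partition-creation calls are not shown here; but the "dynamic partitions" half of the question is mostly about deriving one partition per time slice. A minimal Python sketch of that slicing logic, under the assumption that each partition is bound to a month of the fact table (the names `FactSales` and `OrderDate` are hypothetical, and the (name, query) pairs would be fed to AMO partition objects):

```python
from datetime import date

def monthly_partition_defs(fact_table, date_column, start, end):
    """Generate (name, source-query) pairs, one partition per month.

    Hypothetical helper: the queries produced here are what one would
    bind to partitions created through AMO (a .NET object model, not
    shown), one partition per calendar month in [start, end].
    """
    defs = []
    year, month = start.year, start.month
    while (year, month) <= (end.year, end.month):
        lo = date(year, month, 1)
        # First day of the following month (True counts as 1 in Python).
        hi = date(year + (month == 12), month % 12 + 1, 1)
        name = f"{fact_table}_{year}{month:02d}"
        query = (f"SELECT * FROM {fact_table} "
                 f"WHERE {date_column} >= '{lo}' AND {date_column} < '{hi}'")
        defs.append((name, query))
        year, month = hi.year, hi.month
    return defs
```

Each partition then gets a non-overlapping slice of the fact table, which is the usual prerequisite for dropping or reprocessing old months independently.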

> 2. Earlier, a cube could be archived with up to 2 GB of data so
> anyone could analyze it. Now with AS2005 one has to deploy first and
> then process the data, and only then can one analyze it.

I don't think I understood what you mean here; in particular, I didn't get the
relationship between archiving, deployment and processing.

> 3. I am not fully convinced about proactive caching. What advantages
> do you get from it for the cubes in hand? Is it only for processing
> the data, or will it help in querying also?

Proactive caching is there to keep the cubes up to date while retaining fast
access. It is not a querying feature.
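Conceptually, proactive caching sits between processing and querying: it keeps a MOLAP-style cache fresh as the source changes, and can fall back to the relational source while the cache is stale. A toy Python model of that behavior (a sketch under simplifying assumptions, not the real SSAS mechanism):

```python
class ProactiveCacheSketch:
    """Toy model of proactive caching (an assumption-laden sketch, not
    the real SSAS implementation): queries are answered from a prebuilt
    cache; a change notification marks the cache stale; a stale cache
    answers from the source (ROLAP-style fallback) until it is rebuilt."""

    def __init__(self, source_rows):
        self.source = source_rows          # stand-in for the relational source
        self.cache = sum(source_rows)      # stand-in for the MOLAP aggregate
        self.stale = False

    def notify_change(self, new_rows):
        self.source = new_rows
        self.stale = True                  # notification invalidates the cache

    def rebuild(self):
        self.cache = sum(self.source)      # reprocess from the source
        self.stale = False

    def query_total(self):
        # Fresh cache: fast cached answer. Stale: fall back to the source.
        return sum(self.source) if self.stale else self.cache
```

The point of the sketch is that queries are written the same way throughout; only where the answer comes from, and how fresh it is, changes.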

> 4. For the browser that ships with BI Studio, up to what data volume
> does it work well?

If you mean the embedded cube browser - it is simply OWC11, which has been
shipping for a few years now.

pras

Aug 23, 2005, 5:38:14 AM
Hi,
Thanks for the very fast response.

My query no. 2 was: in AS2000 one could archive the entire processed OLAP
database into a CAB file and then deploy it to production, or even use the
CAB for analyzing the data. But in AS2005, with the evolution of BI Studio
closely integrated with .NET, one has to deploy and build the solution file,
and only blank cubes get generated, without data. I want to confirm whether
I am conveying what I mean.

I am a little bit confused about the jargon "UDM". How do I define a UDM in
BI Studio?

I appreciate the translation feature in AS2005, but how do I use it if I
need the member names in French? Moreover, translation looks like a tedious
task.

Please reply

Regards,
Prasanna

Dave Wickert [MSFT]

Aug 23, 2005, 9:27:22 AM
If you look at SSMS (Management Studio), you will see management options to
back up and restore a database. Thus if you want to do deployment the same
way you did with AS2K, you can.

In AS2K5 there are more options: for example, deployment by hand with BI Dev
Studio, using SSIS packages, using the AS Deployment Wizard, server and
database synchronization, etc. Thus you will find in AS2K5 that the question
you will be asking yourself is which way is best in a given situation. You
have lots more options.

--
Dave Wickert [MSFT]
dwic...@online.microsoft.com
Program Manager
BI Systems Team
SQL BI Product Unit (Analysis Services)
--


This posting is provided "AS IS" with no warranties, and confers no rights.


"pras" <pras...@gmail.com> wrote in message
news:1124789894.8...@f14g2000cwb.googlegroups.com...

Vania

Aug 23, 2005, 11:15:02 AM
In my opinion, a significant limitation of AS2005 is the fact that Drill
Through can no longer go to the atomic level (I mean, if I go to the leaf
level, I see the data aggregated to the leaf level, not the single records
that generated it, along with other significant columns coming from the fact
table).

Mosha Pasumansky [MS]

Aug 23, 2005, 1:34:57 PM
But why wouldn't you see the transaction-level data if you added a fact
dimension to your cube (potentially leaving it a ROLAP dimension)?

--
====================================================
Mosha Pasumansky - http://www.mosha.com/msolap
Analysis Services blog at http://www.sqljunkies.com/WebLog/mosha
Development Lead in the Analysis Server team
All you need is love (John Lennon)

Disclaimer: This posting is provided "AS IS" with no warranties, and
confers no rights.
====================================================
"Vania" <Va...@discussions.microsoft.com> wrote in message
news:24E32F4C-3EFC-439F...@microsoft.com...

Deepak Puri

Aug 23, 2005, 4:27:31 PM
As Mosha mentioned, you can still get atomic level fact data using
DrillThrough in AS 2005 - see this MSDN paper:

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsql90/html/sql2k5_anservdrill.asp
>>
Enabling Drillthrough in Analysis Services 2005

T.K. Anand
Microsoft Corporation

July 2005

Applies to:
SQL Server 2005 Analysis Services

Summary: Discover the new Analysis Services 2005 drillthrough
architecture. See how to set up drillthrough in Analysis Services 2005
and get guidance on migrating drillthrough settings from Analysis
Services 2000 databases.
..
>>


- Deepak

Deepak Puri
Microsoft MVP - SQL Server

*** Sent via Developersdex http://www.developersdex.com ***

pras

Aug 24, 2005, 1:24:29 AM
Hi,
Thanks again to everyone for the valuable responses. To continue with
AS2005, please find some more queries below, including the ones I haven't
got any answers to yet.

They are as below:

1. I am a little bit confused about the jargon "UDM". How do I define a UDM
in BI Studio?

2. I appreciate the translation feature in AS2005, but how do I use it if I
need the member names in French? Moreover, translation looks like a tedious
task.

3. What about virtual cubes and virtual dimensions in AS2005? I feel they
do not exist any more. I wanted to know which feature virtual cubes
have been replaced with.

Please reply

Regards,
Prasanna

Vania

Aug 24, 2005, 3:39:03 AM
As from the article you mentioned to me:
"The biggest impact of the new drillthrough architecture on cube designers
is the requirement that all drillthrough columns must be part of the cube
schema. Cube designers must include additional attributes and measures in the
cube. This can increase the schema complexity and the data size."
This can be quite costly if, in the drillthrough action, you want to
see, let's say, 80 attributes that didn't have to be aggregated, from a fact
table with millions of rows.

Further, it seems a useless effort since, as from Books Online,
referring to DRILLTHROUGH: "This statement allows the client application to
retrieve the rowsets that were used to create a specified cell in a cube. A
Multidimensional Expressions (MDX) statement is used to specify the subject
cell. If this cell is at an atomic level (that is, at the lowest level of its
hierarchy), only one rowset is returned."
I think the usefulness of DRILLTHROUGH was mainly that of supplying the
rowsets at the atomic level (from the relational database); otherwise it
simply allows doing the same thing I do by opening a lower level of a
specified cell in the browser.

It seems that now the only way of doing so is to write an application that
queries the relational database with SQL, given the cell coordinates. Do you
agree?
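The application described here could start from something like the sketch below: turning a cell's dimension coordinates into a SQL statement over the fact table. All table and column names (`FactSales`, `ProductKey`, etc.) are hypothetical, and real code would also have to resolve member keys through the dimension tables and handle multiple partitions:

```python
def drillthrough_sql(fact_table, coordinates, columns):
    """Sketch of a drillthrough-by-hand helper (hypothetical names):
    translate a cell's dimension coordinates into a SQL query returning
    the underlying fact rows. A production version would parameterize
    values instead of inlining them, and map surrogate keys via the
    dimension tables."""
    select = ", ".join(columns) if columns else "*"
    # Sort for a deterministic WHERE clause; one predicate per coordinate.
    where = " AND ".join(f"{col} = '{val}'"
                         for col, val in sorted(coordinates.items()))
    return f"SELECT {select} FROM {fact_table} WHERE {where}"
```

As Mosha notes below, the hard part is not this string-building but reproducing the Data Source View logic and partition layout, which is why the fact-dimension route is usually preferred.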

"Deepak Puri" wrote:

> As Mosha mentioned, you can still get atomic level fact data using
> DrillThrough in AS 2005 - see this MSDN paper:
>
> http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsql90
> /html/sql2k5_anservdrill.asp
> >>
> Enabling Drillthrough in Analysis Services 2005
>
> T.K. Anand
> Microsoft Corporation
>
> July 2005
>
> Applies to:
> SQL Server 2005 Analysis Services
>
> Summary: Discover the new Analysis Services 2005 drillthrough
> architecture. See how to set up drillthrough in Analysis Services 2005
> and get guidance on migrating drillthrough settings from Analysis
> Services 2000 databases.

> ...

Deepak Puri

Aug 24, 2005, 12:55:52 PM
Based on my limited experience so far with AS 2005 drillthrough, I think
that you should be able to return the fact rows which underlie a cube
cell, as in AS 2000. The key is setting up the fact table for the
measure group as a degenerate "fact" dimension, and configuring the
ROLAP storage option to avoid the overhead of MOLAP storage at the
granularity of the fact table. From the paper:

>>
Degenerate dimensions are typically large since their cardinality is the
same as the fact table. Cube designers who are concerned about the size
can set the storage mode of the dimension to ROLAP. Dimensional analysis
that does not query the degenerate dimension will be unaffected, but
drillthrough queries will be slower as a result.
>>

So the fact table is now modelled as a dimension of the cube, but any
access to that dimension will result in a relational query (as
drillthrough in AS 2000 does). This also allows fact-level data to be
accessed via a conventional MDX query, without using drillthrough (which
isn't possible in AS 2000). What I'm still looking for is how to manage
query performance and overhead with this increased flexibility, so that
the solution works well.
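For illustration, a drillthrough over such a fact dimension can be issued with the AS 2005 MDX DRILLTHROUGH statement and its new RETURN clause, which names the columns (including degenerate-dimension attributes) to bring back. The sketch below uses Adventure Works-style names that are illustrative only and not verified against a live cube:

```
DRILLTHROUGH MAXROWS 100
SELECT ([Measures].[Sales Amount]) ON COLUMNS
FROM [Adventure Works]
WHERE ([Product].[Product].&[310])
RETURN [$Internet Sales Order Details].[Sales Order Number],
       [Measures].[Sales Amount]
```

The inner SELECT must resolve to a single cell; RETURN then controls which relational-grain columns come back, which is the AS 2005 replacement for the AS 2000 per-cube drillthrough column list.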

Mosha Pasumansky [MS]

Aug 24, 2005, 8:46:21 PM
> What I'm still seeking for is how to manage
> query performance and overhead with this increased flexibility, so that
> the solution works well.

As long as you don't query these fact (a.k.a. degenerate) dimensions, there
will be no impact on performance. You can even make them hidden, so users
don't drag-and-drop them accidentally, but you can still refer to them inside
your report or drillthrough actions.

> It seems that now the only way of doing so is to write an application
> that queries the relational database with SQL, given the cell
> coordinates. Do you agree?

This may turn out to be a pretty complex application, because it will have to
support all the flexibility and power of DSVs, in addition to having to deal
with multiple partitions, etc. So creating fact dimensions is indeed the
correct approach.

Vania

Aug 25, 2005, 3:22:01 AM
I know about degenerate dimensions, thank you. But if I have to build a cube
which is not the required one, with 'strange' dimensions, how can I justify
and explain it to the final user? For this reason, and for the huge overhead
(in preparing, processing, and showing the right way to the user), I don't
appreciate this workaround very much, and prefer the solution of creating an
application that queries the fact tables with SQL through the cell
coordinates... Maybe it's a matter of opinions, I don't know...
Thanks again!
Vania

Deepak Puri

Aug 25, 2005, 11:55:53 AM
Well, I think that Mosha described some options which would minimize the
AS 2005 drillthrough impact for users:

- Hide the degenerate dimension, so they don't see it

- No performance impact, when not accessing it

So, the only "overhead" that I can see is your own learning curve:
users can still use the MDX DRILLTHROUGH statement, with configurable fields
returned, as in AS 2000. But if you find this overhead greater than that of
building a bespoke app, you can certainly choose the latter.

pras

Aug 29, 2005, 1:04:05 AM
Hi,
I was building a dimension for Product using the Adventure Works DW
database.

The steps I followed are:

1. DimProduct as the main table for the standard dimension.
2. Selected ProductKey as the main key.
3. Selected DimProductCategory and DimProductSubcategory as the
related tables.
4. Selected attributes:
        a. English Product Category Name
        b. English Product Subcategory Name
        c. English Product Name
        d. Color
        e. Style
        f. Weight
5. Selected the dimension type as Regular.
6. No parent-child relationship specified, as it is a standard
dimension.
7. Next pane: the dimension contains no hierarchies.
8. Named the dimension Product, clicked Finish.
9. Created a Product hierarchy containing:
        a. Product Category (renamed from English Product Category Name)
        b. Product Subcategory (renamed from English Product Subcategory Name)
        c. Product Name (renamed from English Product Name)
10. Processed the dimension.
11. It gives me a series of errors:
        a. The following system error occurred: No mapping between
           account names and security IDs was done.
           (I could not really understand what this is speaking of.)
        b. Errors in the high-level relational engine. A connection
           could not be made to the data source with the DataSourceID
           of 'Adventure Works DW', Name of ''.
           (I have created the data source 'Adventure Works DW' already.)
        c. Errors in the OLAP storage engine: An error occurred while
           the dimension, with the ID of 'Dim Product', Name of 'Dim
           Product', was being processed.
        d. Errors in the OLAP storage engine: An error occurred while
           the 'English Product Name' attribute of the 'Dim Product'
           dimension from the 'Analysis Services Project2' database
           was being processed.
        e. Errors in the OLAP storage engine: An error occurred while
           the 'Color' attribute of the 'Dim Product' dimension from
           the 'Analysis Services Project2' database was being
           processed.
        f. Errors in the OLAP storage engine: An error occurred while
           the 'Weight' attribute of the 'Dim Product' dimension from
           the 'Analysis Services Project2' database was being
           processed.

I could not understand what these errors mean.

Can anyone explain what simple mistake has happened?

Regards,
Prasanna

pras

Aug 30, 2005, 7:47:02 AM
The simple answer to this: right-click the data source, choose Open, and
under Impersonation Information select the radio button "Use the service
account". This might be because the OLAP service runs under the Local
System account.


Regards,
Prasanna
