OBIEE architecture for massive data extractions


Daniele DeFaveri

Aug 3, 2011, 11:57:37 PM
to OBIEE Enterprise Methodology Group
Hi all,
we all know that OBIEE is not an extraction tool and is not designed to execute massive extractions from the database (Oracle writes this everywhere too), but in the real world the end users use, and want to use, OBIEE in exactly this way: they freely extract all the data they want from the database and then work on it in other ways (usually Excel).
So, how confident can we be, and how far can OBIEE be pushed for this kind of extraction (65,000 rows or more)? Where can we work to improve the performance of the extractions and to minimize the impact on the OBIEE system (CPU, memory, usability of the system for other users, etc.)?
Are there configuration parameters that could significantly influence this kind of use of OBIEE?
Are there differences between 10g and 11g?

And from an architectural perspective, are there other tools that we can place alongside OBIEE and integrate with it to better support this kind of work for our users?

Any feedback will be appreciated,

Regards,
Daniele

chet justice

Aug 4, 2011, 2:56:33 AM
to obiee-enterpri...@googlegroups.com
In other words, data dumps?

What about scheduling iBots that send large CSV files in the off hours? 

If you're going to do that though, why not just take the SQL, put it in a shell script and then put it in cron? SQL Developer works well for this too. 
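A minimal sketch of that cron route, assuming SQL*Plus is installed on the server; the connection string, the sales table, and the paths are all placeholders:

    #!/bin/sh
    # nightly_dump.sh - spool a large extract to CSV off-hours, bypassing OBIEE.
    # user/pass@db, the sales table, and /data/exports are illustrative only.
    sqlplus -s user/pass@db <<'EOF'
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON LINESIZE 4000
    SPOOL /data/exports/sales_dump.csv
    SELECT order_id || ',' || customer_id || ',' || amount FROM sales;
    SPOOL OFF
    EXIT
    EOF

A crontab entry such as 0 2 * * * /home/oracle/nightly_dump.sh then runs it at 2 a.m., well away from the interactive reporting day.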

I jump up and down quite a bit when I encounter this (I'm prone to childish behavior). I think there are a couple of reasons it happens (people take the dumps and then do their analysis elsewhere).
  1. People love Excel. It's the single greatest analytic tool ever created. Be careful though.
  2. Usually an inability to teach the end-users the power of SQL/OBIEE. This takes multiple forms:
    1. Business analysts aren't savvy enough with SQL in general or OBIEE in particular. 
    2. We don't say No.
I love working with the end-users and seeing that light bulb go off in their head. It's rare, but fun.

Robert Tooker

Aug 4, 2011, 6:18:05 AM
to obiee-enterpri...@googlegroups.com
Any web-based tool is going to struggle with rendering large data volumes. I've pushed 10g to the point where Firefox is unusable; 11g doesn't even need much pushing to become unusable, probably due to the higher volume of HTML and the more complex CSS, like background images on header cells in pivot tables. The default limits are set very low, and once you start raising them you soon see why!

Another option, alongside what Chet suggested, is the Excel plugin, but I haven't pushed that very hard, so I'm not sure of its performance limitations.
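Since "raising them" comes up a lot: those defaults live in instanceconfig.xml on the Presentation Services side. A minimal sketch of the commonly cited 11g elements follows; the exact element names and nesting vary between versions, so treat these as assumptions to verify against your version's documentation, and restart Presentation Services after editing:

    <ServerInstance>
      <!-- rows/cells the Presentation Server will pull into a view's result cube -->
      <Cube>
        <CubeMaxRecords>200000</CubeMaxRecords>
        <CubeMaxPopulatedCells>200000</CubeMaxPopulatedCells>
      </Cube>
      <Views>
        <Table>
          <MaxCells>200000</MaxCells>  <!-- cells rendered per table view -->
        </Table>
        <Pivot>
          <MaxCells>200000</MaxCells>  <!-- same cap for pivot tables -->
        </Pivot>
      </Views>
    </ServerInstance>

Raising these trades browser and Presentation Server memory for volume, which is exactly the pain described above.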

Regards,

Robert

je...@brewpalace.com

Aug 4, 2011, 12:55:14 PM
to obiee-enterpri...@googlegroups.com
Ahh - this "requirement" - it seems like it has been posted every month for the last five years in some form or another.

Honestly, if you have a requirement to download that much data for people to use in Excel, then you've missed the point of BI and how a BI tool fits in.  Instead, you should be asking what people do with that data in Excel.  What are they doing to it once it's in there?  Most likely they are continuing to alter it, add to it, etc., in order to produce some other reports.  You should be targeting the output from Excel, not the input to Excel.  Otherwise, you aren't really doing anything - just replacing one dump with another.  BI is about aligning with and improving business processes, not providing access to raw data.  The BI tool is very rarely the end of a business process - there is a longer process to consider.  Not improving it and re-doing the same poor process truly is "paving over cow-paths".

Here is an example I came across that illustrates this:

I showed up to a project and they wanted to download 100,000 records.  The "report" requirement had already been written up as such.  Instead of doing that, however (I don't follow orders very well), I contacted the person who requested the report and asked what she was doing with that data.  It turns out she had to combine it with another, smaller data set to produce a report which our BI system could not produce on its own.  The solution was to add that small data set to our system and produce the result directly in OBI.
Benefits of this are many:
1 - She is now freed up from doing this rote, manual, low-value-add work.  Analysts should be doing high-value-add work.
2 - The results are available faster than ever before.
3 - The output is easily shareable with many people, as it sits in a dashboard and doesn't require posting to a shared server or email threads.
4 - The process of producing the output has gone through an extensive QA process, thereby cutting down on manual human errors and improving accuracy.
5 - The results are now integrated into the bigger BI environment and can be mixed and matched with other metrics.  Now some real analysis can be done, as opposed to a static report.
6 - You might even be able to improve the frequency and timeliness of the results, as the process can be run, say, nightly as opposed to weekly/monthly.  Perhaps the current frequency is not really good enough for the business, but that's what they had to settle on due to the manual nature of the process.

I think if you discuss these benefits with a user, they will most likely gladly cut out the boring work they have to do.  If they have job security issues, then you'll have to go higher up to their management, and explain how their team can be more productive.


Jeff M.





Andriy Yakushyn

Sep 24, 2015, 11:10:56 AM
to OBIEE Enterprise Methodology Group
Reviving this four-year-old thread. I have a requirement to produce several reports with 500-2000+ columns (a few hundred of the columns carry CASE-statement logic); the output goes to a file. Jeff's suggestion would not work here, as significant commitments have already been made. For this report, I'm considering using OBIEE as an interface (the primary requirement is to use OBIEE as the entry point to the report) to trigger the report and pass filters (whether through the Action Framework or a simple write-back table), and utilizing PL/SQL or an ETL/ELT tool to generate the report on the backend. It seems to be a more efficient approach to me at this time.
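For what it's worth, here is a minimal sketch of the write-back half of that pattern, assuming PL/SQL on the backend; the table, procedure, columns, and directory object (EXTRACT_REQUEST, RUN_EXTRACT, SALES, EXTRACT_DIR) are all hypothetical names for illustration:

    -- Hypothetical parameter table that the OBIEE write-back populates.
    CREATE TABLE extract_request (
      request_id  NUMBER PRIMARY KEY,
      region      VARCHAR2(30),             -- filter passed from the dashboard prompt
      status      VARCHAR2(10) DEFAULT 'NEW'
    );

    -- Backend procedure (run via DBMS_SCHEDULER or an ETL job) that spools the extract.
    CREATE OR REPLACE PROCEDURE run_extract(p_request_id NUMBER) IS
      l_file UTL_FILE.FILE_TYPE;
      l_req  extract_request%ROWTYPE;
    BEGIN
      SELECT * INTO l_req FROM extract_request WHERE request_id = p_request_id;

      -- EXTRACT_DIR is a database directory object mapped to a server path.
      l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'extract_' || p_request_id || '.csv', 'w', 32767);

      FOR r IN (SELECT order_id, amount FROM sales WHERE region = l_req.region) LOOP
        UTL_FILE.PUT_LINE(l_file, r.order_id || ',' || r.amount);
      END LOOP;

      UTL_FILE.FCLOSE(l_file);
      UPDATE extract_request SET status = 'DONE' WHERE request_id = p_request_id;
      COMMIT;
    END run_extract;
    /

The OBIEE side then only inserts a request row (via write-back or an Action Framework call) and, once the status flips to DONE, points the user at the file; the heavy query never runs through the BI Server.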

I'd really like to see if anyone has done something similar and can share any additional information. 

Thank you. And I'll be back with more bridge-table discussions :-)

Andriy