Real-time OBI


Jeff McQuigg

Mar 21, 2011, 6:01:57 PM
to OBIEE Enterprise Methodology Group
Boy, this forum has been quiet of late. Let's get a new thread going!

I'm hearing a lot of demand for real-time OBI solutions recently,
and I'm not talking about Fusion OTBI. Are you guys seeing the same
thing?

What are your thoughts on mapping OBI to real-time systems? Which
architectures are you considering or using? What about the pros/cons
of each?
I see about 5:
1 - microbatch ETL
2 - map OBI to source
3 - map OBI to replica of source
4 - map OBI to a transformed and enhanced version of the source
5 - federate across a DW and source

And what about mapping OBI directly to a 3NF model? What about
getting rid of the DW/stars altogether and just using the source?
This I see as much more difficult to map in OBI, a 10/10 on the
complexity scale, unless you are doing just some simple operational
transactional list reports. Furthermore, there are some things you
simply cannot do without a DW, such as snapshot fact tables, SCDs, and
many others.

Is anyone working on one now? What are the pain points for it?

Regards,
Jeff M.

Sachin Jain, CPHIMS

Mar 21, 2011, 6:32:24 PM
to obiee-enterpri...@googlegroups.com
5 - federate across a DW and source

This is the model we are working on. The high-level reports are driven off an Essbase cube or an aggregated DW layer, which is a cleaner, de-normalized version of the operational DB.

If users need to drill to source, we allow that for a few users by giving them drill-through capability.

Sachin



Jit Dutta

Mar 21, 2011, 7:20:19 PM
to obiee-enterpri...@googlegroups.com

I am coming across real-time reporting needs too. But do customers consider using the OOTB Oracle reports for EBS, or PS reports for the PeopleSoft ERP?

I am not sure how practical the micro-batch ETL solution is (there is a performance impact on the transactional system, and to some extent on the OBI reports as well).

Running OBI reports against replicated OLTP data seems like a good option, but it also has a reporting performance impact, since the queries have to run against a normalized schema.

If we are to build real-time OBI reports against an OLTP schema (replicated or not), they should probably be limited in scope and play only a supplemental role to the main DW-based reporting.

As for federated reporting, I wonder if anyone has had a good experience with it in a real-life client situation.

Jit


Kris

Mar 21, 2011, 7:20:42 PM
to obiee-enterpri...@googlegroups.com, obiee-enterpri...@googlegroups.com, Jeff McQuigg
Using BI Publisher could be an option.


Kris

Sent from my iPhone


Jit Dutta

Mar 21, 2011, 7:28:22 PM
to obiee-enterpri...@googlegroups.com

Hi Sachin,

Did you have to rebuild the Essbase hierarchy to match the levels to those in OBIEE, or the other way around, to enable drill-down across data sources?

Jit


Shyam Varan Nath

Mar 21, 2011, 7:31:14 PM
to obiee-enterpri...@googlegroups.com, Sachin Jain, CPHIMS
Sachin,
Federation and drill-through from Essbase are really two different options:

1) In federation, the Essbase and relational sources are treated as different sources for the logical fact table. OBIEE / the BI Server decides where each query needs to be redirected to fetch the data.

2) Essbase drill-through via OBIEE can work through guided navigation, where OBIEE points to Essbase for the summary and the user can be "guided" to the detail in the relational source. I do not think Essbase drill-through (say, using EIS) can be activated via OBIEE seamlessly. It works great in EPM tools like Smart View or the Excel Add-in.

Thanks
Shyam

Shyam Varan Nath

Mar 21, 2011, 7:27:55 PM
to obiee-enterpri...@googlegroups.com, Kris, Jeff McQuigg
Well, BIP gives static reports, and no real BI "analysis" can be done. So yes, for OLTP / operational reporting BIP is a good tool, but not for BI and interactive analysis.
Thanks

Shyam Varan Nath

Mar 21, 2011, 6:33:42 PM
to obiee-enterpri...@googlegroups.com, Jeff McQuigg


1 - microbatch ETL [possible for selected entities; change control can become complex; micro ETL can be run many times a day]
2 - map OBI to source [to a third-normal-form source; discussed below]
3 - map OBI to replica of source [large companies make a DR-style copy of the OLTP and use it for reporting; may not be easily suited to smaller companies and, for one thing, needs double the disk space, but puts no reporting load on the OLTP]
4 - map OBI to a transformed and enhanced version of the source [essentially the ODS approach; often done, but does not store historical or analytic info; good for multiple sources]
5 - federate across a DW and source [technically possible, but makes the RPD a little complex, as it is hard to join IDs across DW PKs and OLTP PKs; joins on business keys may have to be used]

And what about mapping OBI directly to a 3NF model? What about
getting rid of the DW/stars altogether and just using the source?

[Noetix and EIS are vendors who do that for operational reporting. Oracle's Fusion Analytics (now obsolete) did that too, and DBI kind of did that for EBS.
The advantage is that no real ETL engine is needed, but it works well for operational reporting, not really for analytic reporting like year-over-year comparisons. Also, the OLTP gets loaded by the reporting queries.]



Ramesh Kumar

Mar 21, 2011, 8:05:49 PM
to obiee-enterpri...@googlegroups.com
Even though it is not ideal to use OBIEE for real-time reporting, there may be cases where it becomes necessary. Even though Oracle has not officially killed the Discoverer product, there is a mindset among customers that they need to get rid of Discoverer and move to OBIEE. If Discoverer queries are already running against the OLTP, can using OBIEE make things worse? If yes, what could one do to avoid it?

The other issue is the complexity in the RPD: if we try to do both DW and OLTP together, there are definitely problems. So what about keeping the DW portion of the RPD separate from the OLTP portion? This approach may not be elegant, but it is certainly less complex.

Any thoughts?

Thanks
Ramesh
--
Ramesh Kumar
eAlliance Corporation
rku...@ealliancecorp.com
Mobile# 630-375-9454

Vlahos, Greg

Mar 21, 2011, 8:04:31 PM
to obiee-enterpri...@googlegroups.com

Actually, you can integrate Essbase into the Business Model layer "on top of" the underlying relational detail in the same way you would use an aggregate fact table. You do need to have the dimension member names aligned with the relational columns, but those relational columns can be logical calculated columns that apply whatever prefixes Essbase may require. So Essbase can be implemented in a seamless manner, where the user is unaware that they are drilling across two sources.
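
For example (all names made up here): if Essbase stores years as members like FY2011 while the relational detail holds a plain year number, the relational side just needs a derived column that produces the same member values. In the RPD this would typically be a logical calculated column rather than a database view, but the idea is the same as this sketch:

-- Derive a relational column whose values match the Essbase member names
-- (e.g. 'FY2011'), so both logical table sources conform on the same level key.
SELECT 'FY' || TO_CHAR(t.fiscal_year) AS year_member,   -- matches the Essbase Year members
       SUM(f.sales_amount)            AS sales_amount
FROM   sales_fact f
JOIN   time_dim   t ON t.time_key = f.time_key
GROUP  BY 'FY' || TO_CHAR(t.fiscal_year);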

 

However, back on the real-time reporting point: a solution with Essbase could realistically only get near real time using the trickle-feed load approach, and I would not consider that real-time BI. It is great for fast performance across summary data and this-year-vs.-last-year type analysis.

 

Thanks,

 

Greg Vlahos

Analytic Vision

 


GERARD

Mar 22, 2011, 6:47:47 AM
to OBIEE Enterprise Methodology Group
Real-time is often for reporting and not for analysis.
The real question is: do you need real time for your analytics?

You can find a good article that talks about "right time" instead of
real time:
http://adtmag.com/Articles/2003/09/29/BI-Real-time-or-right-time.aspx

I don't think that analysis on a monthly basis needs real time, for
instance. For weekly analysis, maybe two refreshes per day can be enough.

What are the needs?

Otherwise, a 3NF model on a good machine using GoldenGate, partitioning and
parallelism (for full partition-wise joins) can be a solution.

Cheers
Nico







Robert Tooker

Mar 22, 2011, 6:04:05 AM
to obiee-enterpri...@googlegroups.com
A very timely question! I am looking at this exact thing right now.

I don't think federation on its own is a solution for real-time reporting, unless you are happy that your aggregated data contains different/less information than your detail data, and that drill-through will yield different results from your summary query.

If you have any sort of volume and/or complex transforms then this is all about compromise. You need to split your real-time requirements from your analytic requirements. You might have things like:

  - Operational reports - I need a list of things to know how to allocate my resources this afternoon
  - Up-to-the-minute measures - in some scenarios I need to know my sales results as of 10 seconds ago

The first is the easier of the two. The solutions I'm considering are:
  1) Application push-back - these should be application functions. But back in the real world ...
  2) Mapping the OLTP into the repository in a separate business model from the OLAP database - forming your 3NF into stars in the logical layer. The issue here is that you cannot control the SQL generated and you will end up with some messy joins. Also, it will be entirely independent of the other business model (no conformed dimensions).
  3) Using a series of standalone views and mapping those into standalone folders within a subject area. Messy but workable.
  4) Direct database requests.
  5) BI Publisher.

At the moment I think 5 is the only way to go: you can control the SQL, trigger procedures before and after the report runs (e.g. if you need to stage the data at report run time - still real time, but it can take some load off the OLTP for very complex queries; see the sketch below), and keep Answers for analytics. Obviously you lose flexibility and put the burden of report writing back on IT (or at least on the BIP data model), but that's the compromise. The problem I have with 2 and 3 is that end users will inevitably start using them for analytic queries and performance will quickly become a problem. Once you put something in, it's hard to take it away.
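
To sketch the staging idea (made-up names; in practice the BIP before-data event trigger would call a PL/SQL wrapper around something like this), the heavy join runs once into a report-scoped table and the report data set then reads only that table:

-- Snapshot just the rows the report needs, then let the BIP data set query
-- the staging table instead of hammering the OLTP joins at render time.
DELETE FROM rt_order_stage;              -- assumed report-scoped staging table

INSERT INTO rt_order_stage (order_id, customer_name, order_total, created_ts)
SELECT o.order_id,
       c.customer_name,
       o.order_total,
       o.created_ts
FROM   oltp_orders    o
JOIN   oltp_customers c ON c.customer_id = o.customer_id
WHERE  o.created_ts >= SYSDATE - 1/24;   -- e.g. only the last hour of activity

COMMIT;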

The final idea I have for "up to the minute" is a measure that contains only the information not yet processed by the ETL, read directly from the OLTP, with limited overlay on selected fields in selected dimensions (no Type IIs here!) - e.g. using an insert date or an ID that will hit an index. A second measure of OLAP + OLTP is then required, which will give you your up-to-the-minute results. This only really works if your OLTP data is very inserty and not very updatey (technical terms).
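
Roughly, the OLTP-only leg would look like this (invented names), and OBIEE then combines it with the normal warehouse measure to give the up-to-the-minute total:

-- Only the rows inserted after the last ETL high-water mark; an index on
-- created_ts keeps this probe cheap, per the "very inserty, not very updatey"
-- assumption.
SELECT TRUNC(o.created_ts) AS day_key,
       o.product_id,
       SUM(o.line_amount)  AS sales_amount
FROM   oltp_order_lines o
WHERE  o.created_ts > (SELECT MAX(last_load_ts) FROM etl_control)  -- assumed control table
GROUP  BY TRUNC(o.created_ts), o.product_id;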

My general feeling is that this comes from the "we want everything" requirement. If you want a responsive BI tool then there's compromise; if not, then there's a great query tool that gives you access to everything in the database in real time - SQL.

Regards,

Robert

Nish

Mar 22, 2011, 1:25:34 PM
to OBIEE Enterprise Methodology Group
If we use a fast database machine like Exadata or Netezza, then I
think we can report directly against a snowflake schema without
worrying about performance.
I have heard that both of these can speed up queries by up to 600x in
some cases.

-Nish

chet justice

Mar 22, 2011, 1:37:10 PM
to obiee-enterpri...@googlegroups.com, Nish
Excellent segue. I had a draft saved, but didn't necessarily want to hijack the thread.

Jeff, remember this? 

http://greatobi.wordpress.com/2010/12/01/oow-bi-implementation-panel-questions-part-3/

I still believe that using Exadata can change the way you design systems from the ground up, with the possibility that you won't need a DW for every environment.

SCDs could be handled by an OLTP data model, START_DATE/END_DATE, where the current or active record has a NULL END_DATE.
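
For example (made-up table and column names), a point-in-time lookup against such a model is just a date-bracket predicate:

-- Versioned OLTP table: each change closes the old row's END_DATE and opens a
-- new row; the current row is the one with END_DATE IS NULL.
SELECT c.customer_id,
       c.credit_limit
FROM   customer_versions c
WHERE  c.customer_id = :cust_id
AND    c.start_date <= :as_of_date
AND    (c.end_date IS NULL OR c.end_date > :as_of_date);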

Naturally, this is just me pondering, but I do believe, from what I have seen of Exadata performance, that this might not be too far off. It is most definitely a great topic for discussion.

chet 

Jeff McQuigg

Mar 22, 2011, 2:18:21 PM
to OBIEE Enterprise Methodology Group
Chet - yes big hardware can help out here and there (for a while that
is), but I don't think it will ever be a replacement for a true DW.

Others - is anyone else doing real-time work with GoldenGate, InfaCDC, or
something else?

We've used InfaCDC on my current project, but we only really used it
for replication with minimal transformation. It was very easy to
develop with, as you use the regular INFA tools. The infrastructure and
tuning parameters were a bit more involved, however. Finally, there is
a very tight coupling between source and target, which makes things very
interesting when the source is being upgraded. For example, what if
your 500-million-row source table is enhanced with a new field? Does
that flood through your real-time mechanism and kill it? Does your
real-time tool even know about records that have been changed, let's
say in a non-journaled/logged portion of the database?


Does anyone have experience with micro-batch ETL they'd like to
share?

How about some comments on mapping OBI to 3NF?

Jeff M.



Robert Tooker

Mar 22, 2011, 2:56:40 PM
to obiee-enterpri...@googlegroups.com
I've done a little mapping of a 3NF DB into OBIEE in a prototype situation, and my experience is that it's fairly straightforward, but you have to be a bit clever with federation in some situations. Say you have a header and an item table, and some of your dimension foreign keys are on the header and some on the item. You need to model the pair as a single LTS to create your logical star, but that then creates an issue if you want to see the header information in a different context, which is where federation comes in. With a bit of thought it's quite simple if your data model is simple - but some table relationships really need an ETL.
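
For example (made-up tables), the pair flattens like this, whether as the join inside a single LTS or as a database view, so the item-level grain carries the header-level keys:

-- One row per order item, with the header-level dimension keys (customer,
-- order date) carried down next to the item-level key (product).
CREATE OR REPLACE VIEW v_order_item_fact AS
SELECT h.order_id,
       h.customer_id,   -- FK held on the header
       h.order_date,    -- FK held on the header
       i.product_id,    -- FK held on the item
       i.quantity,
       i.line_amount
FROM   order_headers h
JOIN   order_items   i ON i.order_id = h.order_id;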

Issues are:
  - Performance. Most users will run a request, look at the data, then decide what to filter. Fine if you are using cubes or aggregated tables, not so good against an OLTP.
  - Prompts. In a DWH environment you generally load selective information from tables that may have multiple purposes (e.g. an OLTP address table might hold employees, customers and offices, but in the DWH these could be separate entities). Your OLTP prompt is going to show everything by default, and it might be a performance issue to make it more selective.
  - Caching. If you want real time, there's no point having a cache - so again performance becomes an issue.
  - History. The obvious one here.

I'm sure there are plenty of others - there are reasons we do what we do (I hope).

Regards,

Robert






chet justice

Mar 22, 2011, 3:36:38 PM
to obiee-enterpri...@googlegroups.com, Jeff McQuigg
I see I will have to corner you at a conference and bribe you with food or beer, or both.





Stewart Bryson

Mar 23, 2011, 8:56:54 AM
to obiee-enterpri...@googlegroups.com, chet justice, Jeff McQuigg
I did a presentation on this at Open World last year... and am doing it again at Collaborate. There are modeling techniques, combined with ETL techniques, that converge on what Kimball calls the "real-time partition". Basically, you have separate segments for fact tables (and possibly dimensions, in some configurations) that are the target for the micro-batch loads. This helps with performance, as these smaller, intra-day structures won't need to be indexed, etc. At the end of the day, there is a "closing the book" process, where late-arriving dimensions are handled, physical structures like indexes are built, and then the intra-day table is partition-exchanged in.
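
Roughly, that end-of-day swap looks like this (object names invented):

-- Build indexes on the lightly structured intra-day table only now, then swap
-- it into the current day's partition of the main fact in one dictionary
-- operation (assumes matching local indexes exist on the partitioned fact).
CREATE INDEX rt_sales_intraday_ix ON rt_sales_intraday (day_key, product_key);

ALTER TABLE dw_sales_fact
  EXCHANGE PARTITION p_current_day
  WITH TABLE rt_sales_intraday
  INCLUDING INDEXES
  WITHOUT VALIDATION;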

In this presentation, I compare this technique with the OBIEE federation techniques and discuss the pros and cons. Interestingly, there are a lot of different ways that OBIEE can support the real-time need: multiple LTSs, one for the OLTP schema, one for the EDW, and possibly one for the "in-between state" if you have late-arriving dimensions, etc.

One requirement that I see as absolutely necessary is CDC of some kind. Even if you are going to do the OBIEE federated approach, that should really be against a foundation layer, or ODS, of some kind, and not the actual source system. If you can report off a replicated version of the source system (using fast streaming techniques like Oracle GoldenGate) that actually sits in the same database as the EDW, then the sky is the limit, and performance is less of a concern. That's because OBIEE can push the federated SQL down as a single UNION statement... and this makes all the difference for performance.
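
The pushed-down SQL then has roughly this shape (a hand-written approximation with invented schema and table names, not actual generated SQL), all resolvable by the one database:

-- Completed history from the EDW star plus today's activity from the
-- replicated copy of the source, combined in a single statement.
SELECT day_key, SUM(amount) AS amount
FROM (
    SELECT f.day_key, f.amount
    FROM   edw.sales_fact f
    WHERE  f.day_key < TRUNC(SYSDATE)
    UNION ALL
    SELECT TRUNC(r.created_ts) AS day_key, r.line_amount AS amount
    FROM   ods.order_lines_rep r           -- GoldenGate-replicated source table
    WHERE  r.created_ts >= TRUNC(SYSDATE)
)
GROUP BY day_key;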

Of course, for serious versions of this, Exadata is the way to go. But unless you are using set-based ETL, Exadata will give no help to something like Informatica, which does row-by-row processing, unless SQL overrides or "push-down" technology is used. Exadata will obviously help with query performance, but not with the micro-batch ETL if the "enterprise ETL" tools are used. You're much better off with OWB, ODI or any of the comparable set-based tools, which will get incredible gains from the Exadata hardware and software enhancements.
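
By set-based I mean the whole micro-batch is applied as one statement the database can parallelize, rather than a fetch-and-apply loop. Roughly (hypothetical tables):

-- One MERGE over the whole CDC delta instead of row-by-row processing; this is
-- the kind of statement that benefits directly from Exadata and parallel DML.
MERGE INTO dw_sales_fact f
USING stg_sales_delta d                      -- assumed CDC/staging delta table
ON (f.order_line_id = d.order_line_id)
WHEN MATCHED THEN UPDATE
     SET f.amount = d.amount
WHEN NOT MATCHED THEN INSERT
     (order_line_id, day_key, product_key, amount)
     VALUES (d.order_line_id, d.day_key, d.product_key, d.amount);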

Stewart Bryson

Julio Navarro

Apr 23, 2011, 6:08:51 AM
to OBIEE Enterprise Methodology Group
Excellent thread!
Although in my opinion there is not much need for real-time BI these
days, it is an increasing requirement.
Anyway, when we analyze these kinds of requirements it is important to
differentiate operational reporting (which can be solved using
Publisher, for instance) from real BI applications. Once the
requirement is classified as "BI", these are my opinions on how to
address it, for each of the architectures mentioned:

1 - microbatch ETL - adds complexity
2 - map OBI to source - this can only be achieved for simple queries
in which there is not much calculation / transformation. If so, it is
better to use Publisher
3 - map OBI to replica of source - could be a solution, but it usually
requires complex configuration and you may have performance problems,
as the replica is not a BI-modeled database. Maybe using materialized
views (FAST refreshed) is a better solution for Oracle data sources
(see the sketch after this list)
4 - map OBI to a transformed and enhanced version of the source - same
as the previous item
5 - federate across a DW and source - performance problems
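
On point 3, this is the kind of thing I mean (names invented; the exact fast-refresh restrictions depend on the Oracle version and the MV shape):

-- Change log on the replicated source table, then an incrementally maintained
-- aggregate that OBI can query instead of the raw replica.
CREATE MATERIALIZED VIEW LOG ON order_lines_rep
  WITH ROWID, SEQUENCE (order_date, product_id, line_amount)
  INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW mv_daily_sales
  REFRESH FAST ON COMMIT
AS
SELECT order_date,
       product_id,
       SUM(line_amount)   AS sales_amount,
       COUNT(line_amount) AS amount_cnt,    -- COUNT columns needed for fast refresh
       COUNT(*)           AS row_cnt
FROM   order_lines_rep
GROUP  BY order_date, product_id;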

Maybe the best solution is using a combination of a DW / Data Mart and
micro ETL, as Stewart explained. You can split your DM (star) into an
operational layer - provisioned by micro ETL - and a historic layer
(the traditional DW). Actually, we are now using a similar architecture to
incorporate users' adjustments to information previously loaded:

- A large fact table is provisioned on a daily basis from the source
systems. The ETL processes are complex. The large fact table has indexes,
materialized views, etc.
- Once the information is presented to users, they can make some
"adjustments" that are stored in a replica fact table. There is no
aggregation for this replica fact table. These adjustments are
recorded online using APEX, and carry some traceability: user,
comments, date, etc.
- The large fact table and the replica fact table are federated in OBIEE.

Regards.