Message from discussion data cleansing: externally or internally?
From: joel garry <joel-ga...@home.com>
Subject: Re: data cleansing: externally or internally?
Date: Fri, 4 Nov 2011 17:12:04 -0700 (PDT)
On Nov 3, 11:51 pm, geos <g...@nowhere.invalid> wrote:
> There is a big text file with dirty data, and a company wants it
> cleaned. There are some known patterns expressed as LIKE or regexp
> conditions. My first thought was two approaches:
> 1) do this at the system level, or
> 2) do it in a database.
> For the latter case it looks to me as if I could use external tables,
> or load the data into a temporary table and then do the cleaning.
> I am looking for the pros and cons of each variant. My intuition tells
> me that loading into a temporary table would give the most
> flexibility, but would also take additional space. I am not sure about
> the other methods, and would appreciate your opinion on what I should
> pay attention to when choosing among them. How are they restricted in
> terms of performance, flexibility and capabilities (e.g. multitable
> loading)? I am also interested in good practices and any experience
> with similar cases you can share.
> Thank you.
> NOTE: Follow Up set to comp.databases.oracle.misc
Out of the database. I've done this many times: whenever the database
is the limiting factor, I take the data out of the db, clean it, and
put it back in. The two biggest factors are redo generation and memory
limitations, the latter of which are handled better by unix and
pipelining. Plus, it's easier to split the transforms into little
pieces and pipe them together.
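A minimal sketch of that pipeline style, with hypothetical file names
and cleanup patterns (the sample data and rules are made up here just
to show the shape of it):

```shell
#!/bin/sh
# Clean outside the database with small piped transforms, then load
# the result with SQL*Loader or point an external table at it.
# File names and patterns below are hypothetical.

# Sample dirty input, for illustration only.
printf 'ACME  Corp ;2011-11-03\nWidgets  Inc;2011-11-04\n;no name;bad row\n' > dirty.txt

# Each stage is one small transform; the stages stream through pipes,
# so memory use stays flat regardless of file size and no redo is
# generated until the final clean load.
sed 's/  */ /g' dirty.txt |
  # trim spaces around the field delimiter
  sed 's/ *; */;/g' |
  # drop rows with an empty first field
  grep -v '^;' > clean.txt

cat clean.txt
```

Each filter stays trivial to test on its own, and you can swap one
stage without touching the rest.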
@home.com is bogus.