
Stored procedure


JJ297

Oct 1, 2008, 12:05:53 PM
Can someone help me get the right data out...

I want to compare last week's table Diary9-22 to this week's table
Diary9-29 to see whether the same PAN (field name) values appear in both.

How would I write this?

Roy Harvey (SQL Server MVP)

Oct 1, 2008, 12:41:28 PM
On Wed, 1 Oct 2008 09:05:53 -0700 (PDT), JJ297 <nc...@yahoo.com> wrote:

Is PAN a unique column? Or may a value appear more than once?

If it is unique, then this will show only the values that appear in
just one of the tables.

SELECT A.PAN, B.PAN
FROM [Diary9-22] AS A
FULL OUTER
JOIN [Diary9-29] AS B
    ON A.PAN = B.PAN
WHERE A.PAN IS NULL
   OR B.PAN IS NULL;

If PAN is not unique you could simply add DISTINCT to that query, but
depending on how that performs you might want to try applying DISTINCT
to each table before the join. That could be done in derived tables or,
in SQL Server 2005 or later, in Common Table Expressions; both forms
are shown below.

WITH
A AS
(
    SELECT DISTINCT PAN
    FROM [Diary9-22]
),
B AS
(
    SELECT DISTINCT PAN
    FROM [Diary9-29]
)
SELECT A.PAN, B.PAN
FROM A
FULL OUTER
JOIN B
    ON A.PAN = B.PAN
WHERE A.PAN IS NULL
   OR B.PAN IS NULL;
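
For completeness, the derived-table form of the same query would look
like this (a sketch added for illustration, not part of the original post):

SELECT A.PAN, B.PAN
FROM (SELECT DISTINCT PAN FROM [Diary9-22]) AS A
FULL OUTER
JOIN (SELECT DISTINCT PAN FROM [Diary9-29]) AS B
    ON A.PAN = B.PAN
WHERE A.PAN IS NULL
   OR B.PAN IS NULL;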

Roy Harvey
Beacon Falls, CT

JJ297

Oct 1, 2008, 1:58:18 PM
On Oct 1, 12:41 pm, "Roy Harvey (SQL Server MVP)" wrote:

I am actually going to put this in SSIS, so I want the duplicates to go
to a Pending table and the others to go to the Cleared table. I
figured I would get the SQL statement working first, then I can sort
out the SSIS side. Thanks!

JJ297

Oct 1, 2008, 2:12:00 PM

Okay, it doesn't look like it is giving me the correct info. The PAN
number is listed one time in each table. I want to combine both
tables, take out the duplicates, and put them into a table called
Pending. The ones that are only mentioned one time should then go
into a Cleared table. Is this possible?

Plamen Ratchev

Oct 1, 2008, 2:22:43 PM
To get all duplicates from both tables, you can run a query like this:

SELECT PAN
FROM (SELECT PAN FROM [Diary9-22]
      UNION ALL
      SELECT PAN FROM [Diary9-29]) AS T
GROUP BY PAN
HAVING COUNT(*) > 1;

Getting the single occurrences is very similar:

SELECT PAN
FROM (SELECT PAN FROM [Diary9-22]
      UNION ALL
      SELECT PAN FROM [Diary9-29]) AS T
GROUP BY PAN
HAVING COUNT(*) = 1;
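
To route these into the two tables JJ297 described, the same queries
can feed INSERT statements (a minimal sketch, assuming hypothetical
Pending and Cleared tables with a PAN column already exist; this is
not from the original reply):

INSERT INTO Pending (PAN)
SELECT PAN
FROM (SELECT PAN FROM [Diary9-22]
      UNION ALL
      SELECT PAN FROM [Diary9-29]) AS T
GROUP BY PAN
HAVING COUNT(*) > 1;

INSERT INTO Cleared (PAN)
SELECT PAN
FROM (SELECT PAN FROM [Diary9-22]
      UNION ALL
      SELECT PAN FROM [Diary9-29]) AS T
GROUP BY PAN
HAVING COUNT(*) = 1;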

--
Plamen Ratchev
http://www.SQLStudio.com

JJ297

Oct 1, 2008, 2:29:40 PM

Thanks that worked!

JJ297

Oct 1, 2008, 3:27:08 PM

How do I add this to my stored procedure to get the duplicates out?

DATEDIFF("dd",Out_DryDte1,GETDATE()) > 60 - 12

This isn't working:

SELECT PAN, Out_DryDte1
FROM (SELECT PAN FROM [NewDiary9_22]
UNION ALL
SELECT PAN FROM [NewDiary9_29]) AS T
GROUP BY PAN
HAVING COUNT(*) > 1 and (Out_DryDte1,Getdate()) > 60 -12

I want to get all of the records whose date is more than 60 days minus
12 days (48 days) before today's date. Hope that makes sense.

Plamen Ratchev

Oct 1, 2008, 3:36:28 PM
Try this:

SELECT PAN
FROM (SELECT PAN, Out_DryDte1 FROM [NewDiary9_22]
      UNION ALL
      SELECT PAN, Out_DryDte1 FROM [NewDiary9_29]) AS T
WHERE Out_DryDte1 < DATEADD(DAY,
                            DATEDIFF(DAY, 0, CURRENT_TIMESTAMP) - (60 - 12),
                            0)
GROUP BY PAN
HAVING COUNT(*) > 1;
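
A note on the date expression, added here for clarity (it is not in
the original reply): DATEDIFF(DAY, 0, CURRENT_TIMESTAMP) counts the
whole days from SQL Server's base date (1900-01-01) to now, so
subtracting (60 - 12) and adding the result back onto the base date
yields midnight 48 days ago, with the time portion stripped:

SELECT DATEADD(DAY, DATEDIFF(DAY, 0, CURRENT_TIMESTAMP) - (60 - 12), 0);
-- For example, run on 2008-10-01 this returns 2008-08-14 00:00:00.000.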

Plamen Ratchev

Oct 1, 2008, 12:21:37 PM
Below are a couple of methods (the last one requires SQL Server 2005/2008).
These will give you all PAN values that appear in both tables.

Not sure why weekly data is stored in separate tables; a better approach
is to use a single table with a date (or week number/year) column.

SELECT PAN
FROM [Diary9-22] AS A
WHERE EXISTS (SELECT *
              FROM [Diary9-29] AS B
              WHERE B.PAN = A.PAN);

SELECT A.PAN
FROM [Diary9-22] AS A
JOIN [Diary9-29] AS B
    ON A.PAN = B.PAN;

SELECT PAN
FROM [Diary9-22]
INTERSECT
SELECT PAN
FROM [Diary9-29];
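
One caveat worth noting (an editorial addition, not part of the original
reply): if PAN can repeat within a table, the three queries behave
differently. With a hypothetical PAN of 101 appearing twice in
[Diary9-22] and once in [Diary9-29], the JOIN returns a row for every
matching pair (two rows), EXISTS returns one row per matching row of
[Diary9-22] (two rows), and INTERSECT returns each common value once
(one row). Adding DISTINCT to the first two makes all three equivalent.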

JJ297

Oct 2, 2008, 9:45:39 AM

Okay thanks will try it now!

JJ297

Oct 2, 2008, 9:48:10 AM
On Oct 1, 12:21 pm, Plamen Ratchev <Pla...@SQLStudio.com> wrote:

Oh, the weekly data is stored that way because they want to keep a copy
of the weekly files. So I need to join the tables to get the pending and
cleared files. Really I can just look for the cleared records (the
ones that are in Diary9-22 and not in Diary9-29).

Thanks for the procedures.

Plamen Ratchev

Oct 2, 2008, 10:22:17 AM
You can still keep all the weekly data in one table; you just need an
extra column to store the week start date. That will make the system a
lot more flexible (and you do not have to create a new table every week).
Right now you have to change your queries every week to reflect the
table name change. A rough sketch of that layout is below.
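
As a sketch of that single-table layout (hypothetical names and data
types; the thread never shows any DDL):

CREATE TABLE Diary (
    WeekStart DATETIME NOT NULL,  -- start date of the weekly load, e.g. '20080922'
    PAN INT NOT NULL,             -- data type assumed for illustration
    Out_DryDte1 DATETIME NULL
);

-- The duplicate check then filters on WeekStart instead of switching table names:
SELECT PAN
FROM Diary
WHERE WeekStart IN ('20080922', '20080929')
GROUP BY PAN
HAVING COUNT(*) > 1;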

JJ297

Oct 2, 2008, 11:19:29 AM

Yes, I would have to change the queries each week. I will set it up
the way you suggested. Thanks again for your help!

--CELKO--

Oct 2, 2008, 10:10:57 PM
"A problem well stated is a problem half solved." -- Charles F.
Kettering

Please post DDL, so that people do not have to guess what the keys,
constraints, Declarative Referential Integrity, data types, etc. in
your schema are. If you know how, follow ISO-11179 data element naming
conventions and formatting rules. Temporal data should use ISO-8601
formats. Code should be in Standard SQL as much as possible and not
local dialect.

Sample data is also a good idea, along with clear specifications. It
is very hard to debug code when you do not let us see it. If you want
to learn how to ask a question on a Newsgroup, look at:
http://www.catb.org/~esr/faqs/smart-questions.html

Your narrative seems to describe a file (records and fields!!) and
tables that mimic weekly tapes from a 1950's-style magnetic tape
system. Your table names even look like classic IBM tape labels based
on a date, which violates the basics of an RDBMS. We had a "YYDDD"
format for decades before RDBMS. There are some obvious Y2K problems
with this.
