Data Flow Task Plus 2012


RhiannaSilEl ap Mathonwy

Apr 16, 2013, 3:31:11 PM
to coz...@googlegroups.com
Are there any known issues with using the Data Flow Task Plus component on SSIS 2012?  I'm receiving the following errors when I try to map to destination columns with various formats:

[Flat File Source [1]] Error: Data conversion failed. The data conversion for column "ACCOUNT_ID" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
[Flat File Source [1]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "Flat File Source.Outputs[Flat File Source Output].Columns[ACCOUNT_ID]" failed because error code 0xC0209084 occurred, and the error row disposition on "Flat File Source.Outputs[Flat File Source Output].Columns[ACCOUNT_ID]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
[Flat File Source [1]] Error: An error occurred while processing file "\\windowsxp\SSIS_Refresh\FULL_REFRESH\alternate_id_test.txt" on data row 2.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Flat File Source returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.


...and I receive these errors when I try to load the same data into a staging table that I created using only varchar columns:

[Flat File Source [1]] Error: Data conversion failed. The data conversion for column "STS" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
[Flat File Source [1]] Error: The "Flat File Source.Outputs[Flat File Source Output].Columns[STS]" failed because truncation occurred, and the truncation row disposition on "Flat File Source.Outputs[Flat File Source Output].Columns[STS]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
[Flat File Source [1]] Error: An error occurred while processing file "\\windowsxp\SSIS_Refresh\FULL_REFRESH\alternate_id_test.txt" on data row 2.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Flat File Source returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

I'm creating a new ETL process and hoping to make it entirely dynamic, since that would save a lot of time and possibly prevent errors in the future. Any help would be really, really appreciated.

TIA!

Jakki

RhiannaSilEl ap Mathonwy

Apr 16, 2013, 4:43:47 PM
to coz...@googlegroups.com
I managed to get past the first error, but am on to a new set of errors when trying to add another file to the mix:

[Flat File Source [123]] Error: An error occurred while skipping data rows.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Flat File Source returned error code 0xC0202091.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.


The first file loads fine, but the second file fails.  Could it be an issue with column positions?  The columns are completely different between the two tables. I'm wondering whether Data Flow Task Plus is going to work for what I'm trying to do.  I really want to avoid coding 20 separate data flow tasks, since that would be a support nightmare in the long run.