Getting an exception while running a query


Pratibha Sharma

Mar 20, 2015, 6:26:41 PM3/20/15
to db...@googlegroups.com
I am running a very simple query:

!|Query| select * as x from CTG_DWH..AGG_MOD_TY11 limit 10  |
|x|
|test|

This is the error I am getting:

org.netezza.error.NzSQLException: ERROR:  'select * as x from CTG_DWH..AGG_MOD_TY11 limit  0'
error             ^ found "AS" (at char 10) expecting a keyword

	at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:276)
	at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:73)
	at org.netezza.sql.NzConnection.execute(NzConnection.java:2673)
	at org.netezza.sql.NzPreparedStatament._execute(NzPreparedStatament.java:1059)
	at org.netezza.sql.NzPreparedStatament.prepare(NzPreparedStatament.java:1076)
	at org.netezza.sql.NzPreparedStatament.<init>(NzPreparedStatament.java:86)
	at org.netezza.sql.NzConnection.prepareStatement(NzConnection.java:997)
	at dbfit.api.AbstractDbEnvironment.createStatementWithBoundFixtureSymbols(AbstractDbEnvironment.java:107)
	at dbfit.fixture.Query.getDataTable(Query.java:43)
	at dbfit.fixture.RowSetFixture.doRows(RowSetFixture.java:92)
	at fit.Fixture.doTable(Fixture.java:155)
	at fitlibrary.traverse.AlienTraverseHandler.doTable(AlienTraverseHandler.java:18)
	at fitlibrary.traverse.workflow.DoTraverseInterpreter.interpretWholeTable(DoTraverseInterpreter.java:99)
	at fitlibrary.traverse.workflow.DoTraverseInterpreter.interpretWholeTable(DoTraverseInterpreter.java:87)
	at fitlibrary.DoFixture.interpretWholeTable(DoFixture.java:69)
	at fitlibrary.suite.InFlowPageRunner.run(InFlowPageRunner.java:34)
	at fitlibrary.DoFixture.interpretTables(DoFixture.java:42)
	at dbfit.DatabaseTest.interpretTables(DatabaseTest.java:26)
	at fit.Fixture.doTables(Fixture.java:80)
	at fit.FitServer.process(FitServer.java:81)
	at fit.FitServer.run(FitServer.java:56)
	at fit.FitServer.main(FitServer.java:41)



What am I doing wrong?

Thanks

Patrick Schoenmakers

Apr 2, 2015, 7:57:15 AM4/2/15
to db...@googlegroups.com
You're trying to alias a SELECT *. That's not valid SQL for Netezza.
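A sketch of two valid alternatives, assuming the goal was either to fetch all columns or to expose one column under the name x (the column name some_column is illustrative, not from the original table):

```sql
-- Option 1: drop the alias and select all columns as-is
SELECT * FROM CTG_DWH..AGG_MOD_TY11 LIMIT 10;

-- Option 2: alias an individual column instead of the whole row set
SELECT some_column AS x FROM CTG_DWH..AGG_MOD_TY11 LIMIT 10;
```

With Option 2, the DbFit table below the !|Query| row would then check the x column as in the original test.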

On Friday, March 20, 2015 at 23:26:41 UTC+1, Pratibha Sharma wrote:


Priethi Alagesan

Jan 22, 2019, 12:35:32 PM1/22/19
to dbfit
Hello,

I am a beginner with Spark. I am facing the same issue, with the error below, while running the query in a sandbox.

Py4JJavaError: An error occurred while calling o1201.load.

: org.netezza.error.NzSQLException: ERROR:  'SELECT * FROM......
..........
..........

error                               ^ found "table" (at char 28) expecting `CROSS' or `FULL' or `INNER_P' or `JOIN' or `LEFT'
        at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:287)
        at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:76)
        at org.netezza.sql.NzConnection.execute(NzConnection.java:2904)
        at org.netezza.sql.NzPreparedStatament._execute(NzPreparedStatament.java:1163)
        at org.netezza.sql.NzPreparedStatament.prepare(NzPreparedStatament.java:1180)
        at org.netezza.sql.NzPreparedStatament.<init>(NzPreparedStatament.java:96)
        at org.netezza.sql.NzConnection.prepareStatement(NzConnection.java:1106)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:60)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:114)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:52)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:309)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
        at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:745)

The same query works fine when run from SAS Enterprise Guide. I have been struggling for the past few weeks to get this resolved. Your help would be highly appreciated.
Thanks in advance.
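A common cause of this kind of Netezza parse error under Spark is passing a full SELECT statement where the JDBC reader's dbtable option expects a table name: Spark then embeds the raw text in a FROM clause and Netezza rejects it. A minimal sketch of the usual workaround, wrapping the query as a parenthesized, aliased derived table (the helper name as_dbtable, the alias q, and the connection details are illustrative, not from the original post):

```python
def as_dbtable(query, alias="q"):
    """Wrap a full SELECT so it is legal where a table name is expected,
    i.e. SELECT * FROM (<query>) q -- a parenthesized, aliased derived table."""
    return "({}) {}".format(query.strip().rstrip(";"), alias)


# Hypothetical usage with the Spark JDBC reader (placeholders, not real values):
# df = (spark.read.format("jdbc")
#       .option("url", "jdbc:netezza://host:5480/CTG_DWH")
#       .option("driver", "org.netezza.Driver")
#       .option("dbtable", as_dbtable("SELECT * FROM AGG_MOD_TY11"))
#       .load())

print(as_dbtable("SELECT * FROM AGG_MOD_TY11;"))  # → (SELECT * FROM AGG_MOD_TY11) q
```

Newer Spark versions also accept a separate query option on the JDBC reader, which does this wrapping internally.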
