On May 16, 2012, at 2:20 PM, Eduardo wrote:
>
>
> On Wednesday, May 16, 2012, 19:59:10 UTC+2, Michael Bayer wrote:
> If I put count(), the result will be the number of hits. I would like to have both the number of hits and the resulting column values. For example:
you'd need to use GROUP BY for that. if you say "SELECT COUNT(*) FROM <anything>", you get just one row with one column back - the number of rows in <anything>.
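for example, to get each col1 value together with how many rows carry it, you'd GROUP BY col1 - here's a minimal sketch, using the A mapping and session from the sample further down:

    # one row per distinct col1 value, with the count of matching rows alongside
    print s.query(A.col1, func.count(A.id)).\
            group_by(A.col1).\
            all()

    # with the sample data below, this returns something like
    # [('col1A', 3), ('col1B', 1)]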
> If I execute it with
> Session().query(
>     mytable.c.col1,
>     mytable.c.col2,
>     mytable.c.col3).\
>     distinct().\
>     filter(mytable.c.col1 == 'something').\
>     limit(1).offset(2).\
>     count()
> I get only [33] (for example)
> If I use from_self(), I get an error:
> ProgrammingError: (ProgrammingError) column "anon_1.col1" must appear in the GROUP BY clause or be used in an aggregate function
can't reproduce - see sample:
from sqlalchemy import *
from sqlalchemy.orm import *
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class A(Base):
    __tablename__ = "a"

    id = Column(Integer, primary_key=True)
    col1 = Column(String)
    col2 = Column(String)
    col3 = Column(String)

e = create_engine("postgresql://scott:tiger@localhost/test", echo=True)
Base.metadata.create_all(e)

s = Session(e)
s.add_all([
    A(col1='col1A', col2='col21', col3='col31'),
    A(col1='col1A', col2='col22', col3='col32'),
    A(col1='col1A', col2='col23', col3='col33'),
    A(col1='col1B', col2='col24', col3='col34'),
])

print s.query(
        A.col1,
        A.col2,
        A.col3).\
    distinct().\
    filter(A.col1 == 'col1A').\
    limit(1).offset(2).\
    from_self(func.count('*')).\
    all()
output (after create table):
SELECT count(%(param_1)s) AS count_1
FROM (SELECT DISTINCT a.col1 AS a_col1, a.col2 AS a_col2, a.col3 AS a_col3
FROM a
WHERE a.col1 = %(col1_1)s
LIMIT %(param_2)s OFFSET %(param_3)s) AS anon_1
2012-05-16 14:51:03,099 INFO sqlalchemy.engine.base.Engine {'param_1': '*', 'col1_1': 'col1A', 'param_3': 2, 'param_2': 1}
[(1L,)]
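if the goal is the matching rows plus the total number of hits in one statement, one option - just a sketch, running the row query and a from_self() count separately is equally fine - is a window function, count(*) OVER (), which attaches the overall count to each row (the 'total' label is an arbitrary name; I've left out the distinct()/limit() here since count(*) OVER () counts rows before DISTINCT applies):

    print s.query(
            A.col1,
            A.col2,
            A.col3,
            func.count().over().label('total')).\
        filter(A.col1 == 'col1A').\
        all()

    # each result tuple is (col1, col2, col3, total), e.g.
    # [('col1A', 'col21', 'col31', 3L), ...]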
it would be very helpful if you could supply actual, specific working code here instead of excerpts; it takes a lot of time trying to guess exactly what issue you're having. thanks