Commit
Adding limitSpec support to groupby query
mistercrunch committed Jun 29, 2015
1 parent 3b93c01 commit 97397d3
Showing 1 changed file with 11 additions and 2 deletions.
13 changes: 11 additions & 2 deletions pydruid/client.py
@@ -303,6 +303,8 @@ def build_query(self, args):
                 query_dict['dataSource'] = val
             elif key == 'paging_spec':
                 query_dict['pagingSpec'] = val
+            elif key == 'limit_spec':
+                query_dict['limitSpec'] = val
             elif key == "filter":
                 query_dict[key] = Filter.build_filter(val)
             elif key == "having":
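The hunk above shows the convention at work: build_query maps snake_case keyword arguments onto the camelCase keys Druid expects in the JSON query body, so the new limit_spec kwarg is emitted as limitSpec. The following is a minimal illustrative sketch of that mapping idea, not the pydruid source; the helper name to_druid_keys is hypothetical.

    # Hypothetical helper, for illustration only: snake_case kwargs -> Druid's
    # camelCase JSON keys, as build_query does for limit_spec -> limitSpec.
    def to_druid_keys(**kwargs):
        key_map = {
            'datasource': 'dataSource',
            'paging_spec': 'pagingSpec',
            'limit_spec': 'limitSpec',
        }
        return {key_map.get(k, k): v for k, v in kwargs.items()}

    print(to_druid_keys(limit_spec={'type': 'default', 'limit': 50, 'columns': ['count']}))
    # -> {'limitSpec': {'type': 'default', 'limit': 50, 'columns': ['count']}}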
@@ -435,20 +437,26 @@ def groupby(self, **kwargs):
         :param pydruid.utils.having.Having having: Indicates which groups in results set of query to keep
         :param post_aggregations: A dict with string key = 'post_aggregator_name', and value pydruid.utils.PostAggregator
         :param dict context: A dict of query context options
+        :param dict limit_spec: A dict of parameters defining how to limit the rows returned, as specified in the Druid api documentation
 
         Example:
 
         .. code-block:: python
             :linenos:
 
                 >>> group = query.groupby(
-                        dataSource='twitterstream',
+                        datasource='twitterstream',
                         granularity='hour',
                         intervals='2013-10-04/pt1h',
                         dimensions=["user_name", "reply_to_name"],
                         filter=~(Dimension("reply_to_name") == "Not A Reply"),
                         aggregations={"count": doublesum("count")},
                         context={"timeout": 1000},
+                        limit_spec={
+                            "type": "default",
+                            "limit": 50,
+                            "columns": ["count"]
+                        }
                     )
                 >>> for k in range(2):
                 ...     print group[k]
@@ -459,7 +467,8 @@ def groupby(self, **kwargs):
         self.query_type = 'groupBy'
         valid_parts = [
             'datasource', 'granularity', 'filter', 'aggregations',
-            'having', 'post_aggregations', 'intervals', 'dimensions'
+            'having', 'post_aggregations', 'intervals', 'dimensions',
+            'limit_spec',
         ]
         self.validate_query(valid_parts, kwargs)
         self.build_query(kwargs)
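
With limit_spec accepted as a valid part, a groupby call can pass it straight through to Druid. Below is a hedged end-to-end sketch, not taken from the repository: it assumes a Druid broker at http://localhost:8083 serving a 'twitterstream' datasource, and uses Druid's "default" limitSpec with an OrderByColumnSpec-style column entry so only the 50 largest groups by count are returned.

    # Assumed setup: broker URL, endpoint, and datasource are placeholders.
    from pydruid.client import PyDruid
    from pydruid.utils.aggregators import doublesum

    query = PyDruid('http://localhost:8083', 'druid/v2/')
    group = query.groupby(
        datasource='twitterstream',
        granularity='hour',
        intervals='2013-10-04/pt1h',
        dimensions=['user_name', 'reply_to_name'],
        aggregations={'count': doublesum('count')},
        limit_spec={
            'type': 'default',
            'limit': 50,
            # Per the Druid docs, columns may be plain names or
            # {"dimension": ..., "direction": ...} objects.
            'columns': [{'dimension': 'count', 'direction': 'descending'}],
        },
    )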
