shillelagh.adapters.api package¶
Subpackages¶
- shillelagh.adapters.api.gsheets package
- Subpackages
- Submodules
- shillelagh.adapters.api.gsheets.adapter module
- shillelagh.adapters.api.gsheets.fields module
- shillelagh.adapters.api.gsheets.lib module
- shillelagh.adapters.api.gsheets.types module
- shillelagh.adapters.api.gsheets.typing module
- Module contents
Submodules¶
shillelagh.adapters.api.datasette module¶
An adapter to Datasette instances.
See https://datasette.io/ for more information.
- class shillelagh.adapters.api.datasette.DatasetteAPI(server_url: str, database: str, table: str)[source]¶
Bases:
Adapter
An adapter to Datasette instances (https://datasette.io/).
- get_columns() Dict[str, Field] [source]¶
Return the columns available in the table.
This method is called for every query, so make sure it’s cheap. For most (all?) tables this won’t change, so you can store it in an instance attribute.
- get_cost(filtered_columns: List[Tuple[str, Operator]], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]]) int ¶
Estimate the query cost.
The base adapter returns a fixed cost, and custom adapters can implement their own cost estimation.
- get_data(bounds: Dict[str, Filter], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]], limit: int | None = None, offset: int | None = None, **kwargs: Any) Iterator[Dict[str, Any]] [source]¶
Yield rows as adapter-specific types.
This method expects rows to be in the storage format. Eg, for the CSV adapter datetime columns would be stored (and yielded) as strings. The get_rows method will use the adapter fields to convert these values into native Python types (in this case, a proper datetime.datetime).
Missing values (NULLs) may be omitted from the dictionary; they will be replaced by None by the backend.
- static parse_uri(uri: str) Tuple[str, str, str] [source]¶
Parse the table name, and return the arguments to instantiate the adapter.
- safe = True¶
- static supports(uri: str, fast: bool = True, **kwargs: Any) bool | None [source]¶
Return if a given table is supported by the adapter.
The discovery is done in 2 passes. First all adapters have their methods called with fast=True. On the first pass adapters should implement a cheap method, without any network calls.
If no adapter returns True, a second pass is made with fast=False using only adapters that returned None on the first pass. In this second pass adapters can perform network requests to get more information about the URI.
The method receives the table URI, as well as the adapter connection arguments, eg:
>>> from shillelagh.backends.apsw.db import connect
>>> connection = connect(
...     ':memory:',
...     adapter_kwargs={"gsheetsapi": {"catalog":
...         {"table": "https://docs.google.com/spreadsheets/d/1/"}}},
... )
This would call all adapters in order to find which one should handle the table "table". The GSheets adapter would be called with:
>>> from shillelagh.adapters.api.gsheets.adapter import GSheetsAPI
>>> GSheetsAPI.supports("table", fast=True,  # first pass
...     catalog={"table": "https://docs.google.com/spreadsheets/d/1"})
True
- supports_limit = True¶
- supports_offset = True¶
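The storage-format contract described under get_data can be sketched in plain Python. This is a hypothetical illustration, not shillelagh's actual get_rows implementation: a DateTimeField stand-in parses ISO strings, and missing keys (NULLs) become None, mirroring the backend behaviour.

```python
from datetime import datetime
from typing import Any, Dict, Iterator, List


class DateTimeField:
    """Hypothetical stand-in for a shillelagh field that stores datetimes as strings."""

    def parse(self, value: Any) -> Any:
        # Convert the storage format (ISO string) to a native datetime.
        return None if value is None else datetime.fromisoformat(value)


def get_rows(
    columns: Dict[str, DateTimeField],
    raw_rows: List[Dict[str, Any]],
) -> Iterator[Dict[str, Any]]:
    for row in raw_rows:
        # Keys omitted by the adapter (NULLs) are filled in with None.
        yield {
            name: field.parse(row.get(name))
            for name, field in columns.items()
        }


columns = {"ts": DateTimeField()}
rows = list(get_rows(columns, [{"ts": "2021-01-01T12:00:00"}, {}]))
```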
- shillelagh.adapters.api.datasette.get_field(value: Any) Field [source]¶
Return a Shillelagh Field based on the value type.
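The kind of value-based inference get_field performs can be sketched as follows. This is a minimal illustration mapping Python types to SQL type names; the real function returns shillelagh Field instances, and the exact mapping here is an assumption.

```python
from datetime import date, datetime
from typing import Any


def infer_type(value: Any) -> str:
    """Map a sample value to a SQL type name (illustrative only)."""
    # bool must be checked before int, since bool is a subclass of int.
    if isinstance(value, bool):
        return "BOOLEAN"
    if isinstance(value, int):
        return "INTEGER"
    if isinstance(value, float):
        return "REAL"
    if isinstance(value, (date, datetime)):
        return "TIMESTAMP"
    return "TEXT"
```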
shillelagh.adapters.api.generic_json module¶
shillelagh.adapters.api.generic_xml module¶
shillelagh.adapters.api.github module¶
shillelagh.adapters.api.html_table module¶
shillelagh.adapters.api.preset module¶
shillelagh.adapters.api.s3select module¶
shillelagh.adapters.api.socrata module¶
An adapter to the Socrata Open Data API.
See https://dev.socrata.com/ for more information.
- class shillelagh.adapters.api.socrata.MetadataColumn[source]¶
Bases:
TypedDict
A dictionary with metadata about a Socrata API column.
- class shillelagh.adapters.api.socrata.Number(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]¶
Bases:
Field
A type for numbers stored as strings.
The Socrata API will return numbers as strings. This custom field will convert between them and floats.
- db_api_type = 'NUMBER'¶
- format(value: float | None) str | None [source]¶
Convert from a native Python type to a DB type.
This should be the opposite of parse.
- parse(value: str | None) float | None [source]¶
Convert from a DB type to a native Python type.
Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.
Eg, the Google Sheets API returns dates as strings in its response, using the format "Date(2018,0,1)" for "2018-01-01". A custom field allows the adapter to simply return the original value, and have it automatically converted to a datetime.date object.
This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.
- type = 'REAL'¶
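The Number round trip can be sketched with two plain functions. This is an illustrative stand-in, not the actual field implementation: the Socrata API sends numbers as strings, parse converts them to floats, and format converts back.

```python
from typing import Optional


def parse(value: Optional[str]) -> Optional[float]:
    """DB type (string) to native Python type (float); NULLs pass through."""
    return None if value is None else float(value)


def format_(value: Optional[float]) -> Optional[str]:
    """Native Python type (float) back to the DB type (string)."""
    return None if value is None else str(value)
```

Note that format_ is the opposite of parse, so a value survives a round trip unchanged.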
- class shillelagh.adapters.api.socrata.SocrataAPI(netloc: str, dataset_id: str, app_token: str | None = None)[source]¶
Bases:
Adapter
An adapter to the Socrata Open Data API (https://dev.socrata.com/).
The API is used in many governmental websites, including the CDC. Queries can be sent in the “Socrata Query Language”, a small dialect of SQL.
- get_columns() Dict[str, Field] [source]¶
Return the columns available in the table.
This method is called for every query, so make sure it’s cheap. For most (all?) tables this won’t change, so you can store it in an instance attribute.
- get_cost(filtered_columns: List[Tuple[str, Operator]], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]]) int ¶
Estimate the query cost.
The base adapter returns a fixed cost, and custom adapters can implement their own cost estimation.
- get_data(bounds: Dict[str, Filter], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]], limit: int | None = None, offset: int | None = None, **kwargs: Any) Iterator[Dict[str, Any]] [source]¶
Yield rows as adapter-specific types.
This method expects rows to be in the storage format. Eg, for the CSV adapter datetime columns would be stored (and yielded) as strings. The get_rows method will use the adapter fields to convert these values into native Python types (in this case, a proper datetime.datetime).
Missing values (NULLs) may be omitted from the dictionary; they will be replaced by None by the backend.
- static parse_uri(uri: str) Tuple[str, str] | Tuple[str, str, str] [source]¶
Parse the table name, and return the arguments to instantiate the adapter.
- safe = True¶
- supports_limit = True¶
- supports_offset = True¶
- shillelagh.adapters.api.socrata.get_field(col: MetadataColumn) Field [source]¶
Return a Shillelagh Field from a Socrata column.
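The mapping from Socrata column metadata to field types can be sketched as a lookup table. This is an assumption-laden illustration: it assumes the MetadataColumn carries a "dataTypeName" key (as the Socrata API documents), uses a hypothetical subset of type names, and returns SQL type strings where the real get_field returns shillelagh Field objects.

```python
from typing import Any, Dict

# Hypothetical subset of Socrata dataTypeName values; the real
# adapter may recognise more types and map them differently.
TYPE_MAP = {
    "text": "TEXT",
    "number": "REAL",          # numbers arrive as strings; see Number above
    "calendar_date": "TIMESTAMP",
    "checkbox": "BOOLEAN",
}


def column_type(col: Dict[str, Any]) -> str:
    """Resolve a column's SQL type, falling back to TEXT for unknown types."""
    return TYPE_MAP.get(col.get("dataTypeName"), "TEXT")
```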
shillelagh.adapters.api.system module¶
shillelagh.adapters.api.weatherapi module¶
An adapter to WeatherAPI (https://www.weatherapi.com/).
- class shillelagh.adapters.api.weatherapi.WeatherAPI(location: str, api_key: str, window: int = 7)[source]¶
Bases:
Adapter
An adapter for WeatherAPI (https://www.weatherapi.com/).
The adapter expects a URL like:
https://api.weatherapi.com/v1/history.json?key=$key&q=$location
Where $key is an API key (available for free), and $location is a freeform value that can be a US Zipcode, UK Postcode, Canada Postalcode, IP address, Latitude/Longitude (decimal degree) or city name.
- chance_of_rain = <shillelagh.fields.String object>¶
- chance_of_snow = <shillelagh.fields.String object>¶
- cloud = <shillelagh.fields.Integer object>¶
- dewpoint_c = <shillelagh.fields.Float object>¶
- dewpoint_f = <shillelagh.fields.Float object>¶
- feelslike_c = <shillelagh.fields.Float object>¶
- feelslike_f = <shillelagh.fields.Float object>¶
- get_cost(filtered_columns: List[Tuple[str, Operator]], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]]) float [source]¶
Estimate the query cost.
The base adapter returns a fixed cost, and custom adapters can implement their own cost estimation.
- get_data(bounds: Dict[str, Filter], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]], **kwargs: Any) Iterator[Dict[str, Any]] [source]¶
Yield rows as adapter-specific types.
This method expects rows to be in the storage format. Eg, for the CSV adapter datetime columns would be stored (and yielded) as strings. The get_rows method will use the adapter fields to convert these values into native Python types (in this case, a proper datetime.datetime).
Missing values (NULLs) may be omitted from the dictionary; they will be replaced by None by the backend.
- gust_kph = <shillelagh.fields.Float object>¶
- gust_mph = <shillelagh.fields.Float object>¶
- heatindex_c = <shillelagh.fields.Float object>¶
- heatindex_f = <shillelagh.fields.Float object>¶
- humidity = <shillelagh.fields.Integer object>¶
- is_day = <shillelagh.fields.IntBoolean object>¶
- static parse_uri(uri: str) Tuple[str] | Tuple[str, str] [source]¶
Parse the table name, and return the arguments to instantiate the adapter.
- precip_in = <shillelagh.fields.Float object>¶
- precip_mm = <shillelagh.fields.Float object>¶
- pressure_in = <shillelagh.fields.Float object>¶
- pressure_mb = <shillelagh.fields.Float object>¶
- safe = True¶
- supports_limit = False¶
- supports_offset = False¶
- temp_c = <shillelagh.fields.Float object>¶
- temp_f = <shillelagh.fields.Float object>¶
- time = <shillelagh.fields.DateTime object>¶
- time_epoch = <shillelagh.fields.Float object>¶
- vis_km = <shillelagh.fields.Float object>¶
- vis_miles = <shillelagh.fields.Float object>¶
- will_it_rain = <shillelagh.fields.IntBoolean object>¶
- will_it_snow = <shillelagh.fields.IntBoolean object>¶
- wind_degree = <shillelagh.fields.Integer object>¶
- wind_dir = <shillelagh.fields.String object>¶
- wind_kph = <shillelagh.fields.Float object>¶
- wind_mph = <shillelagh.fields.Float object>¶
- windchill_c = <shillelagh.fields.Float object>¶
- windchill_f = <shillelagh.fields.Float object>¶
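Extracting the adapter arguments from a WeatherAPI history URL can be sketched with the standard library. This is a hypothetical reading of parse_uri, consistent with its Tuple[str] | Tuple[str, str] signature (location alone, or location plus API key); the exact behaviour of the real method is an assumption.

```python
from typing import Tuple, Union
from urllib.parse import parse_qs, urlparse


def parse_weatherapi_uri(uri: str) -> Union[Tuple[str], Tuple[str, str]]:
    """Pull $location (and optionally $key) out of a history.json URL."""
    query = parse_qs(urlparse(uri).query)
    location = query["q"][0]
    key = query.get("key", [None])[0]
    # Return the key only when present, matching the union return type.
    return (location, key) if key else (location,)
```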
- shillelagh.adapters.api.weatherapi.combine_time_filters(bounds: Dict[str, Filter]) Range [source]¶
Combine both time filters together.
The adapter has two time columns that can be used to filter the data, "time" as a timestamp and "time_epoch" as a float. We convert the latter to a timestamp and combine the two filters into a single Range.
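The combination described above can be sketched with plain (start, end) tuples standing in for shillelagh Range filters, where None means an open bound. This is an illustration under that assumption, not the actual implementation: epoch bounds are converted to UTC datetimes, then the two ranges are intersected.

```python
from datetime import datetime, timezone
from typing import Optional, Tuple

Bound = Optional[datetime]


def combine_time_filters(
    time_range: Tuple[Bound, Bound],
    epoch_range: Tuple[Optional[float], Optional[float]],
) -> Tuple[Bound, Bound]:
    """Intersect a "time" range with a "time_epoch" range (sketch)."""
    # Convert float epoch bounds into timezone-aware datetimes.
    epoch_as_dt = tuple(
        None if bound is None
        else datetime.fromtimestamp(bound, tz=timezone.utc)
        for bound in epoch_range
    )
    # Intersection: the latest start and the earliest end, ignoring open bounds.
    start = max((b for b in (time_range[0], epoch_as_dt[0]) if b), default=None)
    end = min((b for b in (time_range[1], epoch_as_dt[1]) if b), default=None)
    return (start, end)
```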