shillelagh package

Subpackages

Submodules

shillelagh.console module

shillelagh.exceptions module

Exceptions defined in the DB API 2.0 spec.

exception shillelagh.exceptions.DataError[source]

Bases: DatabaseError

Errors that are due to problems with the processed data.

Exception raised for errors that are due to problems with the processed data like division by zero, numeric value out of range, etc. It must be a subclass of DatabaseError.

exception shillelagh.exceptions.DatabaseError[source]

Bases: Error

Errors that are related to the database.

Exception raised for errors that are related to the database. It must be a subclass of Error.

exception shillelagh.exceptions.Error[source]

Bases: Exception

Base class of all other error exceptions.

Exception that is the base class of all other error exceptions. You can use this to catch all errors with one single except statement. Warnings are not considered errors and thus should not use this class as base. It must be a subclass of the Python StandardError (defined in the module exceptions).
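
For example, a single except clause can catch any error raised while querying (a minimal sketch, assuming the APSW backend's connect function and a purely illustrative URL):

from shillelagh.backends.apsw.db import connect
from shillelagh.exceptions import Error

connection = connect(":memory:")
cursor = connection.cursor()

try:
    # the URL below is illustrative; any adapter-backed table works here
    cursor.execute('SELECT * FROM "https://example.com/some-resource"')
except Error as ex:
    print(f"Query failed: {ex}")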

exception shillelagh.exceptions.IntegrityError[source]

Bases: DatabaseError

Raised when the relational integrity of the database is affected.

Exception raised when the relational integrity of the database is affected, e.g. a foreign key check fails. It must be a subclass of DatabaseError.

exception shillelagh.exceptions.InterfaceError[source]

Bases: Error

Errors that are related to the database interface.

Exception raised for errors that are related to the database interface rather than the database itself. It must be a subclass of Error.

exception shillelagh.exceptions.InternalError[source]

Bases: DatabaseError

Raised when the database encounters an internal error.

Exception raised when the database encounters an internal error, e.g. the cursor is not valid anymore, the transaction is out of sync, etc. It must be a subclass of DatabaseError.

exception shillelagh.exceptions.NotSupportedError[source]

Bases: DatabaseError

Raised in case a method or database API is not supported.

Exception raised in case a method or database API was used which is not supported by the database, e.g. requesting a .rollback() on a connection that does not support transactions or has transactions turned off. It must be a subclass of DatabaseError.

exception shillelagh.exceptions.OperationalError[source]

Bases: DatabaseError

Errors that are related to the database’s operation.

Exception raised for errors that are related to the database’s operation and not necessarily under the control of the programmer, e.g. an unexpected disconnect occurs, the data source name is not found, a transaction could not be processed, a memory allocation error occurred during processing, etc. It must be a subclass of DatabaseError.

exception shillelagh.exceptions.ProgrammingError[source]

Bases: DatabaseError

Raised for programming errors.

Exception raised for programming errors, e.g. table not found or already exists, syntax error in the SQL statement, wrong number of parameters specified, etc. It must be a subclass of DatabaseError.

exception shillelagh.exceptions.Warning[source]

Bases: Exception

Important warnings like data truncations while inserting.

Exception raised for important warnings like data truncations while inserting, etc. It must be a subclass of the Python StandardError (defined in the module exceptions).

shillelagh.fields module

Fields representing columns of different types and capabilities.

class shillelagh.fields.Blob(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[bytes, bytes]

A blob of bytes.

This is used to represent binary data.

db_api_type = 'BINARY'
quote(value: bytes | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'BLOB'
class shillelagh.fields.Boolean(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[bool, bool]

A boolean.

db_api_type = 'NUMBER'
quote(value: bool | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'BOOLEAN'
class shillelagh.fields.Date(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[date, date]

A date.

This field is used in adapters that use datetime.date as the internal representation of dates.

db_api_type = 'DATETIME'
quote(value: date | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'DATE'
class shillelagh.fields.DateTime(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[datetime, datetime]

A timestamp.

This field is used in adapters that use datetime.datetime as the internal representation of timestamps.

db_api_type = 'DATETIME'
quote(value: datetime | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'TIMESTAMP'
class shillelagh.fields.FastISODateTime(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: ISODateTime

A faster but not fully compliant ISO timestamp parser.

This uses Python’s native datetime.datetime.fromisoformat, which doesn’t support arbitrary ISO 8601 strings. It’s used for serializing values into SQLite and deserializing them back.

parse(value: str | None) datetime | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

class shillelagh.fields.Field(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Generic[Internal, External]

Represents a column in a table.

A field is probably the most important concept in Shillelagh. Fields are used to annotate types, indicate which columns are filterable/sortable, and convert data between different formats.

  1. Type annotation

The most basic use of field is to indicate the types of columns in a given resource. For example, an adapter that connects to a database with two columns, a string and an integer, could look like this:

class SimpleAdapter(Adapter):

    string_col = String()
    integer_col = Integer()

For dynamic resources the columns might be generated dynamically, but the idea is the same:

class DynamicAdapter(Adapter):

    type_map = {
        "int": Integer(),
        "float": Float(),
        "string": String(),
    }

    def get_columns(self) -> Dict[str, Field]:
        columns = read_columns()
        return {
            column.name: self.type_map[column.type]
            for column in columns
        }
  2. Filterable/sortable columns

Most adapters can perform some kind of filtering/sorting on the data, returning less data to the backend in order to optimize queries. Adapters indicate this on the fields, eg:

class FilteringAdapter(Adapter):

    timestamp_col = DateTime(
        filters=[Equal, Range],
        order=Order.ANY,
        exact=True,
    )
    values_col = Float()

The adapter above declares a column called timestamp_col that can be filtered using either an equality (== ‘2020-01-01T00:00:00’) or a range (>= ‘2020-01-01T00:00:00’). Because of this declaration the backend will delegate the filtering to the adapter, which will be responsible for translating the filters into API/filesystem calls to fulfill them.

Additionally, the timestamp column declares an order of ANY, which means that the adapter can sort the data in any order requested by the backend. Fields can declare a static order (eg, Order.ASCENDING) or no order at all.

Finally, the field also indicates that the filtering is exact, and no post-filtering needs to be done by the backend. In some cases it’s useful to have adapters perform an initial coarse filtering (say, at the daily level), and have the backend perform the final fine filtering (say, at the second level), to simplify the logic in the adapter.

  3. Data conversion

Fields are also responsible for converting data between different formats, so that it can flow through the different layers. For example, an adapter might return booleans as strings, and these need to be converted back and forth to Python booleans. The StringBoolean field should be used in that case, and it will automatically convert the data to the format the adapter understands.

Similarly, the APSW backend only accepts types understood by SQLite: ints, floats, strings, and bytes. This means that the backend needs to convert between, eg, native Python booleans and integers. This is also done by using the parse and format methods from fields (IntBoolean in this case).

When creating new fields, the base class should declare the type of the “internal” representation (used by the adapter) and the “external” representation (native Python types). For example, if we have an adapter that stores numbers as strings we could define a new type:

class StringNumber(Field[str, float]):  # Field[internal, external]
    type = "REAL"
    db_api_type = "NUMBER"

    # internal -> external
    def parse(self, value: Optional[str]) -> Optional[float]:
        return value if value is None else float(value)

    # external -> internal
    def format(self, value: Optional[float]) -> Optional[str]:
        return value if value is None else str(value)

Then the adapter can declare columns using that field:

class SomeAdapter(Adapter):

    number_col = StringNumber()

With this, it can simply return rows with the number as a string, without having to explicitly do the conversion:

{"rowid": 0, "number_col": "1.0"}
db_api_type = 'DBAPIType'
format(value: External | None) Internal | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: Internal | None) External | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: Internal | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.
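
A custom quote method could look like the sketch below; PrefixedDate is a hypothetical field for illustration, not an existing Shillelagh class:

from datetime import date
from typing import Optional

from shillelagh.fields import Date


class PrefixedDate(Date):
    """A hypothetical date field that quotes values with a ``date`` prefix."""

    def quote(self, value: Optional[date]) -> str:
        if value is None:
            return "NULL"
        return f"date '{value.isoformat()}'"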

type = ''
class shillelagh.fields.Float(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[float, float]

A float.

db_api_type = 'NUMBER'
type = 'REAL'
class shillelagh.fields.ISODate(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, date]

A date.

This field is used in adapters that use an ISO string as the internal representation of dates. SQLite, for example, has no concept of datetime.date objects, so we need to convert between the object and an ISO string when the data flows through SQLite.
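
A round-trip sketch, assuming plain ISO 8601 strings (outputs shown as comments):

from datetime import date

from shillelagh.fields import ISODate

field = ISODate()
field.parse("2018-01-01")        # -> datetime.date(2018, 1, 1)
field.format(date(2018, 1, 1))   # -> "2018-01-01"
field.parse(None)                # -> None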

db_api_type = 'DATETIME'
format(value: date | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) date | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'DATE'
class shillelagh.fields.ISODateTime(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, datetime]

A timestamp.

This field is used in adapters that use an ISO string as the internal representation of dates. SQLite, for example, has no concept of datetime.datetime objects, so we need to convert between the object and an ISO string when the data flows through SQLite.

db_api_type = 'DATETIME'
format(value: datetime | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) datetime | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'TIMESTAMP'
class shillelagh.fields.ISOTime(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, time]

A time of the day.

This field is used in adapters that use an ISO string as the internal representation of dates. SQLite, for example, has no concept of datetime.time objects, so we need to convert between the object and an ISO string when the data flows through SQLite.

db_api_type = 'DATETIME'
format(value: time | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) time | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'TIME'
class shillelagh.fields.IntBoolean(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[int, bool]

A boolean.

This field is used in adapters that represent booleans as an integer. SQLite, eg, has no boolean type, using 1 and 0 to represent true and false, respectively.
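
A short sketch of the conversion in both directions (outputs shown as comments):

from shillelagh.fields import IntBoolean

field = IntBoolean()
field.parse(1)        # -> True
field.parse(0)        # -> False
field.format(True)    # -> 1
field.format(False)   # -> 0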

db_api_type = 'NUMBER'
format(value: bool | None) int | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: int | None) bool | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: int | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'BOOLEAN'
class shillelagh.fields.Integer(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[int, int]

An integer.

db_api_type = 'NUMBER'
type = 'INTEGER'
class shillelagh.fields.Order(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

An enum for different orders a field can have.

ANY = 'any'
ASCENDING = 'ascending'
DESCENDING = 'descending'
NONE = 'none'
class shillelagh.fields.RowID(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Integer

Custom field for the row ID.

All Shillelagh adapters return a special column for the row ID. In many cases it’s just an increasing integer, but it’s used for DML in adapters that support it.

db_api_type = 'ROWID'
class shillelagh.fields.String(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, str]

A string.

db_api_type = 'STRING'
quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'TEXT'
class shillelagh.fields.StringBlob(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, bytes]

A blob of bytes.

This field is used in adapters that represent binary data as a string with the hexadecimal representation of the bytes.

db_api_type = 'BINARY'
format(value: bytes | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) bytes | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'BLOB'
class shillelagh.fields.StringBoolean(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, bool]

A boolean.

This field is used in adapters that represent booleans as strings, eg, “TRUE” and “FALSE”.

db_api_type = 'NUMBER'
format(value: bool | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) bool | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

static strtobool(val: str) bool[source]

Convert a string representation of truth to a boolean. True values are ‘y’, ‘yes’, ‘t’, ‘true’, ‘on’, and ‘1’. False values are ‘n’, ‘no’, ‘f’, ‘false’, ‘off’, and ‘0’. Raises ValueError if ‘val’ is anything else.
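
For example (values outside the lists above raise ValueError):

from shillelagh.fields import StringBoolean

StringBoolean.strtobool("yes")     # -> True
StringBoolean.strtobool("off")     # -> False
StringBoolean.strtobool("maybe")   # raises ValueError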

type = 'BOOLEAN'
class shillelagh.fields.StringDate(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: ISODate

A more permissive date format.

parse(value: str | None) date | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

class shillelagh.fields.StringDateTime(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: ISODateTime

A more permissive datetime format.

parse(value: str | None) datetime | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

class shillelagh.fields.StringDecimal(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, Decimal]

Decimals as strings.

db_api_type = 'NUMBER'
format(value: Decimal | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) Decimal | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'DECIMAL'
class shillelagh.fields.StringDuration(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, timedelta]

A duration.

This field represents durations as a string.

db_api_type = 'DATETIME'
format(value: timedelta | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) timedelta | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

quote(value: str | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'DURATION'
class shillelagh.fields.StringInteger(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[str, int]

Integers stored as strings.

SQLite can’t handle integers wider than 64 bits (like Parquet’s INT96), so we default to storing them as strings.
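
A round-trip sketch with a value too large for SQLite’s 64-bit integers (outputs shown as comments):

from shillelagh.fields import StringInteger

field = StringInteger()
field.parse("18446744073709551616")   # -> 18446744073709551616 (2**64, as a Python int)
field.format(18446744073709551616)    # -> "18446744073709551616"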

db_api_type = 'NUMBER'
format(value: int | None) str | None[source]

Convert from a native Python type to a DB type.

This should be the opposite of parse.

parse(value: str | None) int | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

type = 'INTEGER'
class shillelagh.fields.StringTime(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: ISOTime

A more permissive time format.

parse(value: str | None) time | None[source]

Convert from a DB type to a native Python type.

Some databases might represent booleans as integers, or timestamps as strings. To convert those values to native Python types we call the parse method in the field associated with the column. Custom adapters can define their own derived fields to handle special formats.

Eg, the Google Sheets API returns dates as strings in its response, using the format “Date(2018,0,1)” for “2018-01-01”. A custom field allows the adapter to simply return the original value and have it automatically converted to a datetime.date object.

This is not a staticmethod because some types need extra information in order to parse a value. Eg, GSheets takes into consideration the timezone of the sheet when parsing timestamps.

class shillelagh.fields.Time(filters: Collection[Type[Filter]] | None = None, order: Order = Order.NONE, exact: bool = False)[source]

Bases: Field[time, time]

A time of the day.

This field is used in adapters that use datetime.time as the internal representation of times of the day.

db_api_type = 'DATETIME'
quote(value: time | None) str[source]

Quote values.

This method is used by some adapters to build a SQL expression. Eg, GSheets represents dates (and other time related types) with the prefix “date”:

SELECT A, B WHERE C = date '2018-01-01'

In order to handle that, the adapter defines its own time fields with custom quote methods.

This is only needed for adapters that use the build_sql helper function, where SQL is generated manually and sent to an API endpoint.

type = 'TIME'

shillelagh.filters module

Filters for representing SQL predicates.

class shillelagh.filters.Endpoint(value: Any, include: bool, side: Side)[source]

Bases: object

One of the two endpoints of a Range.

Used to compare ranges. Eg, the range >10 can be represented by:

>>> start = Endpoint(10, False, Side.LEFT)
>>> end = Endpoint(None, True, Side.RIGHT)
>>> print(f'{start},{end}')
(10,∞]

The first endpoint represents the value 10 at the left side, in an open interval. The second endpoint represents infinity in this case.

class shillelagh.filters.Equal(value: Any)[source]

Bases: Filter

Equality comparison.
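
A small sketch, following the generic build/check examples below:

>>> from shillelagh.filters import Equal, Operator
>>> operations = {(Operator.EQ, 10)}
>>> filter_ = Equal.build(operations)
>>> filter_.check(10)
True
>>> filter_.check(11)
False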

classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
operators: Set[Operator] = {Operator.EQ}
class shillelagh.filters.Filter[source]

Bases: object

A filter representing a SQL predicate.

classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
operators: Set[Operator] = {}
class shillelagh.filters.Impossible[source]

Bases: Filter

Custom Filter returned when impossible conditions are passed.

classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
class shillelagh.filters.IsNotNull[source]

Bases: Filter

Filter for IS NOT NULL.

classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
operators: Set[Operator] = {Operator.IS_NOT_NULL}
class shillelagh.filters.IsNull[source]

Bases: Filter

Filter for IS NULL.

classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
operators: Set[Operator] = {Operator.IS_NULL}
class shillelagh.filters.Like(value: Any)[source]

Bases: Filter

Substring searches.

classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
operators: Set[Operator] = {Operator.LIKE}
class shillelagh.filters.NotEqual(value: Any)[source]

Bases: Filter

Inequality comparison.

classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
operators: Set[Operator] = {Operator.NE}
class shillelagh.filters.Operator(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

Enum representing supported comparisons.

EQ = '=='
GE = '>='
GT = '>'
IS_NOT_NULL = 'IS NOT NULL'
IS_NULL = 'IS NULL'
LE = '<='
LIKE = 'LIKE'
LIMIT = 'LIMIT'
LT = '<'
NE = '!='
OFFSET = 'OFFSET'
class shillelagh.filters.Range(start: Any | None = None, end: Any | None = None, include_start: bool = False, include_end: bool = False)[source]

Bases: Filter

A range comparison.

This filter represents a range, with an optional start and an optional end. Start and end can be inclusive or exclusive.

Ranges can be combined by adding them:

>>> range1 = Range(start=10)
>>> range2 = Range(start=20)
>>> print(range1 + range2)
>20
>>> range3 = Range(end=40)
>>> print(range2 + range3)
>20,<40
classmethod build(operations: Set[Tuple[Operator, Any]]) Filter[source]

Given a set of operations, build a filter:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> print(Range.build(operations))
>20
check(value: Any) bool[source]

Test if a given filter matches a value:

>>> operations = [(Operator.GT, 10), (Operator.GT, 20)]
>>> filter_ = Range.build(operations)
>>> filter_.check(10)
False
>>> filter_.check(30)
True
operators: Set[Operator] = {Operator.EQ, Operator.GE, Operator.GT, Operator.LE, Operator.LT}
class shillelagh.filters.Side(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

Define the side of an interval endpoint.

LEFT = 'LEFT'
RIGHT = 'RIGHT'
shillelagh.filters.get_endpoints_from_operation(operator: Operator, value: Any) Tuple[Endpoint, Endpoint][source]

Returns endpoints from an operation.

shillelagh.functions module

Custom functions available to the SQL backend.

shillelagh.functions.get_metadata(adapter_kwargs: Dict[str, Dict[str, Any]], adapters: List[Type[Adapter]], uri: str) str[source]

Return metadata about a given table.

Returns the name of the adapter that supports the table, as well as any extra metadata provided by the adapter:

sql> SELECT GET_METADATA("https://docs.google.com/spreadsheets/d/1/edit#gid=0");
GET_METADATA("https://docs.google.com/spreadsheets/d/1/edit#gid=0")
-------------------------------------------------------------------
{
    "extra": {
        "Spreadsheet title": "A title",
        "Sheet title": "Another title"
    },
    "adapter": "GSheetsAPI"
}
shillelagh.functions.sleep(seconds: int) None[source]

Sleep for n seconds.

This is useful for troubleshooting timeouts:

sql> SELECT sleep(60);
shillelagh.functions.version() str[source]

Return the current version of Shillelagh.

As an example:

sql> SELECT VERSION();
VERSION()
-----------
0.7.4

shillelagh.lib module

Helper functions for Shillelagh.

shillelagh.lib.NetworkAPICostModel(download_cost: int, fixed_cost: int = 0)[source]

A cost model for adapters with network API calls.

In this case, transferring less data and making fewer connections is more efficient.

class shillelagh.lib.RowIDManager(ranges: List[range])[source]

Bases: object

A row ID manager that tracks insert and deletes.

The RowIDManager should be used with an append-only table structure. It assigns a row ID to each row. When a new row is appended it automatically receives a new ID, and when rows are deleted their ID is changed to -1 to indicate the deletion.

An example:

>>> data = ["zero", "one", "two"]
>>> manager = RowIDManager([range(len(data))])

To insert data:

>>> data.append("three")
>>> manager.insert()
3
>>> data.append("four")
>>> manager.insert(10)  # you can specify a row ID
10
>>> for row_id, value in zip(manager, data):
...     if row_id != -1:
...         print(row_id, value)
0 zero
1 one
2 two
3 three
10 four

To delete data:

>>> manager.delete(data.index("two"))
>>> print(data)
['zero', 'one', 'two', 'three', 'four']
>>> for row_id, value in zip(manager, data):
...     if row_id != -1:
...         print(row_id, value)
0 zero
1 one
3 three
10 four
check_row_id(row_id: int) None[source]

Check if a provided row ID is not being used.

delete(row_id: int) None[source]

Mark a given row ID as deleted.

get_max_row_id() int[source]

Find the maximum row ID.

insert(row_id: int | None = None) int[source]

Insert a new row ID.

shillelagh.lib.SimpleCostModel(rows: int, fixed_cost: int = 0)[source]

A simple model for estimating query costs.

The model assumes that each filtering operation is O(n), and each sorting operation is O(n log n), in addition to a fixed cost.

shillelagh.lib.analyze(data: Iterator[Dict[str, Any]]) Tuple[int, Dict[str, Order], Dict[str, Type[Field]]][source]

Compute number of rows, order, and types from a stream of rows.
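
A small sketch of what the return value contains (based on the signature above):

from shillelagh.lib import analyze

rows = [
    {"a": 1, "b": "one"},
    {"a": 2, "b": "two"},
]
num_rows, order, types = analyze(iter(rows))
# num_rows == 2; ``order`` maps each column name to an Order;
# ``types`` maps each column name to a Field subclass (eg, Integer, String)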

shillelagh.lib.apply_limit_and_offset(rows: Iterator[T], limit: int | None = None, offset: int | None = None) Iterator[T][source]

Apply limit/offset to a stream of rows.
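
A minimal sketch, assuming the usual SQL LIMIT/OFFSET semantics:

from shillelagh.lib import apply_limit_and_offset

rows = iter(range(10))
list(apply_limit_and_offset(rows, limit=3, offset=2))   # -> [2, 3, 4]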

shillelagh.lib.best_index_object_available() bool[source]

Check if support for best index object is available.

shillelagh.lib.build_sql(columns: Dict[str, Field], bounds: Dict[str, Filter], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]], table: str | None = None, column_map: Dict[str, str] | None = None, limit: int | None = None, offset: int | None = None, alias: str | None = None) str[source]

Build a SQL query.

This is used by adapters which use a simplified SQL dialect to fetch data. For GSheets a column map is required, since the SQL references columns by label (“A”, “B”, etc.) instead of name.
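
A hedged sketch of a call; the exact quoting and formatting of the generated SQL may differ:

from shillelagh.fields import Integer, String
from shillelagh.filters import Range
from shillelagh.lib import build_sql

columns = {"name": String(), "age": Integer()}
bounds = {"age": Range(start=21, include_start=True)}
sql = build_sql(columns, bounds, order=[], table="people", limit=10)
# roughly: SELECT * FROM people WHERE age >= 21 LIMIT 10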

shillelagh.lib.combine_args_kwargs(func: Callable[[...], Any], *args: Any, **kwargs: Any) Tuple[Any, ...][source]

Combine args and kwargs into args.

This is needed because we allow users to pass custom kwargs to adapters, but when creating the virtual table we serialize only args.
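
A sketch with a hypothetical adapter signature (init and its parameters are made up for illustration):

from shillelagh.lib import combine_args_kwargs


def init(path, timeout=30, verbose=False):   # hypothetical adapter constructor
    ...


combine_args_kwargs(init, "/tmp/data.csv", verbose=True)
# -> ("/tmp/data.csv", 30, True), with defaults filled in positionally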

shillelagh.lib.deserialize(value: str) Any[source]

Deserialize adapter arguments.

This function is used by the SQLite backend, in order to deserialize the virtual table definition and instantiate an adapter.

shillelagh.lib.escape_identifier(value: str) str[source]

Escape double quotes.

shillelagh.lib.escape_string(value: str) str[source]

Escape single quotes.
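
Both helpers (and their unescape counterparts) follow the usual SQL convention of doubling the quote character; a short sketch:

from shillelagh.lib import escape_identifier, escape_string, unescape_string

escape_string("O'Malley")             # -> "O''Malley"
escape_identifier('a "quoted" name')  # -> 'a ""quoted"" name'
unescape_string("O''Malley")          # -> "O'Malley"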

shillelagh.lib.filter_data(data: Iterator[Dict[str, Any]], bounds: Dict[str, Filter], order: List[Tuple[str, Literal[Order.ASCENDING] | Literal[Order.DESCENDING]]], limit: int | None = None, offset: int | None = None, requested_columns: Set[str] | None = None) Iterator[Dict[str, Any]][source]

Apply filtering and sorting to a stream of rows.

This is used mostly as an exercise. It’s probably much more efficient to simply declare fields without any filtering/sorting and let the backend (SQLite, eg) handle it.

shillelagh.lib.find_adapter(uri: str, adapter_kwargs: Dict[str, Any], adapters: List[Type[Adapter]]) Tuple[Type[Adapter], Tuple[Any, ...], Dict[str, Any]][source]

Find an adapter that handles a given URI.

This is done in two passes: first the supports method is called with fast=True. If no adapter returns True, we do a second pass on the adapters that returned None, passing fast=False so they can perform network requests to better inspect the URI.

shillelagh.lib.flatten(row: Dict[str, Any]) Dict[str, Any][source]

Convert nested values to JSON strings in order to flatten rows.

shillelagh.lib.get_conditions(id_: str, field: Field, filter_: Filter) List[str][source]

Build a SQL condition from a column ID and a filter.

shillelagh.lib.get_session(request_headers: Dict[str, str], cache_name: str, expire_after: timedelta = datetime.timedelta(seconds=180)) CachedSession[source]

Return a cached session.

shillelagh.lib.is_not_null(column: Any, _: Any) bool[source]

Operator for IS NOT NULL.

shillelagh.lib.is_null(column: Any, _: Any) bool[source]

Operator for IS NULL.

shillelagh.lib.serialize(value: Any) str[source]

Serialize adapter arguments.

This function is used with the SQLite backend, in order to serialize the arguments needed to instantiate an adapter via a virtual table.
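
A round-trip sketch with simple, serializable adapter arguments:

from shillelagh.lib import deserialize, serialize

args = ("/path/to/file.csv", {"delimiter": ","})
payload = serialize(args)         # an opaque string embedded in the virtual table definition
deserialize(payload) == args      # -> True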

shillelagh.lib.unescape_identifier(value: str) str[source]

Unescape double quotes.

shillelagh.lib.unescape_string(value: str) str[source]

Unescape single quotes.

shillelagh.lib.update_order(current_order: Order, previous: Any, current: Any, num_rows: int) Order[source]

Update the stored order of a given column.

This is used to analyze the order of columns, by traversing the results and checking if they are sorted in any way.

shillelagh.types module

DB API 2.0 types for Shillelagh.

shillelagh.types.Binary(string: str) bytes[source]

Constructs an object capable of holding a binary (long) string value.

class shillelagh.types.DBAPIType(name: str)[source]

Bases: object

Constructor for the required DB API 2.0 types.

shillelagh.types.Date(year: int, month: int, day: int) date[source]

Constructs an object holding a date value.

shillelagh.types.DateFromTicks(ticks: int) date[source]

Constructs an object holding a date value from the given ticks value.

Ticks should be in number of seconds since the epoch.

shillelagh.types.Time(hour: int, minute: int, second: int) time[source]

Constructs an object holding a time value.

shillelagh.types.TimeFromTicks(ticks: int) time[source]

Constructs an object holding a time value from the given ticks value.

Ticks should be in number of seconds since the epoch.

shillelagh.types.Timestamp(year: int, month: int, day: int, hour: int, minute: int, second: int) datetime[source]

Constructs an object holding a timestamp value.

shillelagh.types.TimestampFromTicks(ticks: int) datetime[source]

Constructs an object holding a timestamp value from the given ticks value.

Ticks should be in number of seconds since the epoch.
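
These constructors are essentially thin wrappers around the datetime module; a short sketch (the FromTicks variants depend on the local timezone):

from shillelagh.types import Date, DateFromTicks, Time, Timestamp

Date(2018, 1, 1)                  # -> datetime.date(2018, 1, 1)
Time(12, 0, 1)                    # -> datetime.time(12, 0, 1)
Timestamp(2018, 1, 1, 12, 0, 1)   # -> datetime.datetime(2018, 1, 1, 12, 0, 1)
DateFromTicks(86_400)             # date one day after the Unix epoch (timezone dependent)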

shillelagh.typing module

Custom types for Shillelagh.

Module contents