Working with Database Metadata¶
With engines and SQL execution down, we are ready to begin some Alchemy. The central element of both SQLAlchemy Core and ORM is the SQL Expression Language, which allows for fluent, composable construction of SQL queries. The foundation for these queries is the set of Python objects that represent database concepts like tables and columns. These objects are known collectively as database metadata.
The most common foundational objects for database metadata in SQLAlchemy are known as MetaData, Table, and Column.
The sections below will illustrate how these objects are used in both a
Core-oriented style as well as an ORM-oriented style.
ORM readers, stay with us!
As with other sections, Core users can skip the ORM sections, but ORM users
would best be familiar with these objects from both perspectives.
The Table object discussed here is declared in a more indirect (and also fully Python-typed) way when using the ORM, however there is still a Table object within the ORM’s configuration.
Setting up MetaData with Table objects¶
When we work with a relational database, the basic data-holding structure
in the database which we query from is known as a table.
In SQLAlchemy, the database “table” is ultimately represented by a Python object similarly named Table.
To start using the SQLAlchemy Expression Language, we will want to have Table objects constructed that represent all of the database tables we are interested in working with. The Table is constructed programmatically, either directly by using the Table constructor, or indirectly by using ORM Mapped classes (described later at Using ORM Declarative Forms to Define Table Metadata). There is also the option to load some or all table information from an existing database, called reflection.
Whichever kind of approach is used, we always start out with a collection where we will place our tables, known as the MetaData object. This object is essentially a facade around a Python dictionary that stores a series of Table objects keyed to their string name. While the ORM provides some options on where to get this collection, we always have the option to simply make one directly, which looks like:
>>> from sqlalchemy import MetaData
>>> metadata_obj = MetaData()
Once we have a MetaData object, we can declare some Table objects. This tutorial will start with the classic SQLAlchemy tutorial model, which has a table called user_account that stores, for example, the users of a website, and a related table address, which stores email addresses associated with rows in the user_account table. When not using ORM Declarative models at all, we construct each Table object directly, typically assigning each to a variable that will be how we will refer to the table in application code:
>>> from sqlalchemy import Table, Column, Integer, String
>>> user_table = Table(
... "user_account",
... metadata_obj,
... Column("id", Integer, primary_key=True),
... Column("name", String(30)),
... Column("fullname", String),
... )
With the above example, when we wish to write code that refers to the user_account table in the database, we will use the user_table Python variable to refer to it.
Components of Table¶
We can observe that the Table construct as written in Python has a resemblance to a SQL CREATE TABLE statement; starting with the table name, then listing out each column, where each column has a name and a datatype. The objects we use above are:
Table - represents a database table and assigns itself to a MetaData collection.

Column - represents a column in a database table, and assigns itself to a Table object. The Column usually includes a string name and a type object. The collection of Column objects in terms of the parent Table is typically accessed via an associative array located at Table.c (a further way of inspecting this collection is sketched just after this list):

>>> user_table.c.name
Column('name', String(length=30), table=<user_account>)
>>> user_table.c.keys()
['id', 'name', 'fullname']

Integer, String - these classes represent SQL datatypes and can be passed to a Column with or without necessarily being instantiated. Above, we want to give a length of “30” to the “name” column, so we instantiated String(30). But for “id” and “fullname” we did not specify these, so we can send the class itself.
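As an additional small sketch (not part of the original example, and assuming the user_table declared above), the Table.c collection can also be iterated to inspect each Column along with its datatype; the rendered type names shown are what we would expect the generic Integer and String types to produce:

>>> for column in user_table.c:
...     print(column.name, column.type)
id INTEGER
name VARCHAR(30)
fullname VARCHAR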
See also

The reference and API documentation for MetaData, Table and Column is at Describing Databases with MetaData.

The reference documentation for datatypes is at SQL Datatype Objects.
In an upcoming section, we will illustrate one of the fundamental functions of Table, which is to generate DDL on a particular database connection. But first we will declare a second Table.
Declaring Simple Constraints¶
The first Column in the example user_table includes the Column.primary_key parameter, which is a shorthand technique of indicating that this Column should be part of the primary key for this table. The primary key itself is normally declared implicitly and is represented by the PrimaryKeyConstraint construct, which we can see on the Table.primary_key attribute on the Table object:
>>> user_table.primary_key
PrimaryKeyConstraint(Column('id', Integer(), table=<user_account>, primary_key=True, nullable=False))
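The same kind of constraint may also be spelled out at the table level using the PrimaryKeyConstraint construct directly, which is useful for composite primary keys. The sketch below is a hypothetical illustration only; the table, columns and constraint name are made up and are not part of this tutorial’s schema:

>>> from sqlalchemy import PrimaryKeyConstraint
>>> example_table = Table(
...     "example_table",
...     MetaData(),
...     Column("a", Integer),
...     Column("b", Integer),
...     PrimaryKeyConstraint("a", "b", name="example_pk"),
... )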
The constraint that is most typically declared explicitly is the ForeignKeyConstraint object that corresponds to a database foreign key constraint. When we declare tables that are related to each other, SQLAlchemy uses the presence of these foreign key constraint declarations not only so that they are emitted within CREATE statements to the database, but also to assist in constructing SQL expressions.

A ForeignKeyConstraint that involves only a single column on the target table is typically declared using a column-level shorthand notation via the ForeignKey object. Below we declare a second table address that will have a foreign key constraint referring to the user_account table:
>>> from sqlalchemy import ForeignKey
>>> address_table = Table(
... "address",
... metadata_obj,
... Column("id", Integer, primary_key=True),
... Column("user_id", ForeignKey("user_account.id"), nullable=False),
... Column("email_address", String, nullable=False),
... )
The table above also features a third kind of constraint, which in SQL is the “NOT NULL” constraint, indicated above using the Column.nullable parameter.
Tip

When using the ForeignKey object within a Column definition, we can omit the datatype for that Column; it is automatically inferred from that of the related column, in the above example the Integer datatype of the user_account.id column.
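To make the inference described in the Tip concrete, the following hedged sketch constructs a standalone Column (not attached to any Table here) with the datatype spelled out explicitly; it declares the same thing as the user_id column above, where the Integer type was inferred from user_account.id:

>>> user_id_explicit = Column("user_id", Integer, ForeignKey("user_account.id"), nullable=False)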
In the next section we will emit the completed DDL for the user_account and address tables to see the completed result.
Emitting DDL to the Database¶
We’ve constructed an object structure that represents two database tables in a database, starting at the root MetaData object, then into two Table objects, each of which hold onto a collection of Column and Constraint objects. This object structure will be at the center of most operations we perform with both Core and ORM going forward.

The first useful thing we can do with this structure will be to emit CREATE TABLE statements, or DDL, to our SQLite database so that we can insert and query data from them. We already have all the tools needed to do so, by invoking the MetaData.create_all() method on our MetaData, sending it the Engine that refers to the target database:
>>> metadata_obj.create_all(engine)
BEGIN (implicit)
PRAGMA main.table_...info("user_account")
...
PRAGMA main.table_...info("address")
...
CREATE TABLE user_account (
id INTEGER NOT NULL,
name VARCHAR(30),
fullname VARCHAR,
PRIMARY KEY (id)
)
...
CREATE TABLE address (
id INTEGER NOT NULL,
user_id INTEGER NOT NULL,
email_address VARCHAR NOT NULL,
PRIMARY KEY (id),
FOREIGN KEY(user_id) REFERENCES user_account (id)
)
...
COMMIT
The DDL create process above includes some SQLite-specific PRAGMA statements that test for the existence of each table before emitting a CREATE. The full series of steps is also included within a BEGIN/COMMIT pair to accommodate transactional DDL.
The create process also takes care of emitting CREATE statements in the correct order; above, the FOREIGN KEY constraint is dependent on the user_account table existing, so the address table is created second. In more complicated dependency scenarios the FOREIGN KEY constraints may also be applied to tables after the fact using ALTER.

The MetaData object also features a MetaData.drop_all() method that will emit DROP statements in the reverse order from which it would emit CREATE statements, in order to drop schema elements.
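As a minimal sketch of that method (not run here, since the rest of this tutorial relies on these tables remaining in place), dropping the same schema would look like:

metadata_obj.drop_all(engine)

which would emit DROP TABLE for address first and user_account second, mirroring the dependency ordering described above.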
Using ORM Declarative Forms to Define Table Metadata¶
When using the ORM, the process by which we declare Table metadata is usually combined with the process of declaring mapped classes. The mapped class is any Python class we’d like to create, which will then have attributes on it that will be linked to the columns in a database table. While there are a few varieties of how this is achieved, the most common style is known as declarative, and allows us to declare our user-defined classes and Table metadata at once.
Establishing a Declarative Base¶
When using the ORM, the MetaData collection remains present, however it itself is associated with an ORM-only construct commonly referred to as the Declarative Base. The most expedient way to acquire a new Declarative Base is to create a new class that subclasses the SQLAlchemy DeclarativeBase class:
>>> from sqlalchemy.orm import DeclarativeBase
>>> class Base(DeclarativeBase):
... pass
Above, the Base class is what we’ll call the Declarative Base. When we make new classes that are subclasses of Base, combined with appropriate class-level directives, they will each be established as a new ORM mapped class at class creation time, each one typically (but not exclusively) referring to a particular Table object.

The Declarative Base refers to a MetaData collection that is created for us automatically, assuming we didn’t provide one from the outside. This MetaData collection is accessible via the DeclarativeBase.metadata class-level attribute. As we create new mapped classes, they each will reference a Table within this MetaData collection:
>>> Base.metadata
MetaData()
The Declarative Base also refers to a collection called registry, which is the central “mapper configuration” unit in the SQLAlchemy ORM. While seldom accessed directly, this object is central to the mapper configuration process, as a set of ORM mapped classes will coordinate with each other via this registry. As was the case with MetaData, our Declarative Base also created a registry for us (again with options to pass our own registry), which we can access via the DeclarativeBase.registry class variable:
>>> Base.registry
<sqlalchemy.orm.decl_api.registry object at 0x...>
Declaring Mapped Classes¶
With the Base class established, we can now define ORM mapped classes for the user_account and address tables in terms of new classes User and Address. We illustrate below the most modern form of Declarative, which is driven from PEP 484 type annotations using a special type Mapped, which indicates attributes to be mapped as particular types:
>>> from typing import List
>>> from typing import Optional
>>> from sqlalchemy.orm import Mapped
>>> from sqlalchemy.orm import mapped_column
>>> from sqlalchemy.orm import relationship
>>> class User(Base):
... __tablename__ = "user_account"
...
... id: Mapped[int] = mapped_column(primary_key=True)
... name: Mapped[str] = mapped_column(String(30))
... fullname: Mapped[Optional[str]]
...
... addresses: Mapped[List["Address"]] = relationship(back_populates="user")
...
... def __repr__(self) -> str:
... return f"User(id={self.id!r}, name={self.name!r}, fullname={self.fullname!r})"
>>> class Address(Base):
... __tablename__ = "address"
...
... id: Mapped[int] = mapped_column(primary_key=True)
... email_address: Mapped[str]
... user_id = mapped_column(ForeignKey("user_account.id"))
...
... user: Mapped[User] = relationship(back_populates="addresses")
...
... def __repr__(self) -> str:
... return f"Address(id={self.id!r}, email_address={self.email_address!r})"
The two classes above, User and Address, are now referred to as ORM Mapped Classes, and are available for use in ORM persistence and query operations, which will be described later. Details about these classes include:
Each class refers to a Table object that was generated as part of the declarative mapping process, which is named by assigning a string to the DeclarativeBase.__tablename__ attribute. Once the class is created, this generated Table is available from the DeclarativeBase.__table__ attribute.

As mentioned previously, this form is known as Declarative Table Configuration. One of several alternative declaration styles would instead have us build the Table object directly, and assign it directly to DeclarativeBase.__table__. This style is known as Declarative with Imperative Table.

To indicate columns in the Table, we use the mapped_column() construct, in combination with typing annotations based on the Mapped type. This object will generate Column objects that are applied to the construction of the Table.

For columns with simple datatypes and no other options, we can indicate a Mapped type annotation alone, using simple Python types like int and str to mean Integer and String. Customization of how Python types are interpreted within the Declarative mapping process is very open ended; see the sections Using Annotated Declarative Table (Type Annotated Forms for mapped_column()) and Customizing the Type Map for background.

A column can be declared as “nullable” or “not null” based on the presence of the Optional[<typ>] type annotation (or its equivalents, <typ> | None or Union[<typ>, None]). The mapped_column.nullable parameter may also be used explicitly (and does not have to match the annotation’s optionality).

Use of explicit typing annotations is completely optional. We can also use mapped_column() without annotations. When using this form, we would use more explicit type objects like Integer and String as well as nullable=False as needed within each mapped_column() construct; a brief sketch of this non-annotated form appears just after this list.

Two additional attributes, User.addresses and Address.user, define a different kind of attribute called relationship(), which features similar annotation-aware configuration styles as shown. The relationship() construct is discussed more fully at Working with ORM Related Objects.

The classes are automatically given an __init__() method if we don’t declare one of our own. The default form of this method accepts all attribute names as optional keyword arguments:

>>> sandy = User(name="sandy", fullname="Sandy Cheeks")

To automatically generate a full-featured __init__() method which provides for positional arguments as well as arguments with default keyword values, the dataclasses feature introduced at Declarative Dataclass Mapping may be used. It’s of course always an option to use an explicit __init__() method as well.

The __repr__() methods are added so that we get a readable string output; there’s no requirement for these methods to be here. As is the case with __init__(), a __repr__() method can be generated automatically by using the dataclasses feature.
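As referenced in the list above, here is a hedged sketch of what the same address mapping might look like written without type annotations, using explicit type objects inside each mapped_column(). The AltBase and AddressNonAnnotated names are hypothetical; a separate Declarative Base is used only so that this sketch does not collide with the Address mapping already declared:

>>> class AltBase(DeclarativeBase):
...     pass
>>> class AddressNonAnnotated(AltBase):
...     __tablename__ = "address"
...
...     id = mapped_column(Integer, primary_key=True)
...     email_address = mapped_column(String, nullable=False)
...     user_id = mapped_column(Integer, ForeignKey("user_account.id"))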
See also
ORM Mapping Styles - full background on different ORM configurational styles.
Declarative Mapping - overview of Declarative class mapping
Declarative Table with mapped_column() - detail on how to use mapped_column() and Mapped to define the columns within a Table to be mapped when using Declarative.
Emitting DDL to the database from an ORM mapping¶
As our ORM mapped classes refer to Table objects contained within a MetaData collection, emitting DDL given the Declarative Base uses the same process as that described previously at Emitting DDL to the Database. In our case, we have already generated the user_account and address tables in our SQLite database. If we had not done so already, we would be free to make use of the MetaData associated with our ORM Declarative Base class in order to do so, by accessing the collection from the DeclarativeBase.metadata attribute and then using MetaData.create_all() as before. In this case, PRAGMA statements are run, but no new tables are generated since they are found to be present already:
>>> Base.metadata.create_all(engine)
BEGIN (implicit)
PRAGMA main.table_...info("user_account")
...
PRAGMA main.table_...info("address")
...
COMMIT
Table Reflection¶
To round out the section on working with table metadata, we will illustrate another operation that was mentioned at the beginning of the section, that of table reflection. Table reflection refers to the process of generating Table and related objects by reading the current state of a database. Whereas in the previous sections we’ve been declaring Table objects in Python, where we then have the option to emit DDL to the database to generate such a schema, the reflection process does these two steps in reverse, starting from an existing database and generating in-Python data structures to represent the schemas within that database.
Tip
There is no requirement that reflection must be used in order to use SQLAlchemy with a pre-existing database. It is entirely typical that the SQLAlchemy application declares all metadata explicitly in Python, such that its structure corresponds to that of the existing database. The metadata structure also need not include tables, columns, or other constraints and constructs in the pre-existing database that are not needed for the local application to function.
As an example of reflection, we will create a new Table object which represents the some_table object we created manually in the earlier sections of this document. There are again some varieties of how this is performed, however the most basic is to construct a Table object, given the name of the table and a MetaData collection to which it will belong, then instead of indicating individual Column and Constraint objects, pass it the target Engine using the Table.autoload_with parameter:
>>> some_table = Table("some_table", metadata_obj, autoload_with=engine)
BEGIN (implicit)
PRAGMA main.table_...info("some_table")
[raw sql] ()
SELECT sql FROM (SELECT * FROM sqlite_master UNION ALL SELECT * FROM sqlite_temp_master) WHERE name = ? AND type in ('table', 'view')
[raw sql] ('some_table',)
PRAGMA main.foreign_key_list("some_table")
...
PRAGMA main.index_list("some_table")
...
ROLLBACK
At the end of the process, the some_table object now contains the information about the Column objects present in the table, and the object is usable in exactly the same way as a Table that we declared explicitly:
>>> some_table
Table('some_table', MetaData(),
Column('x', INTEGER(), table=<some_table>),
Column('y', INTEGER(), table=<some_table>),
schema=None)
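As a small hedged sketch of that equivalence (assuming the x and y INTEGER columns shown above), the reflected Table supports the same column accessors we used earlier on user_table:

>>> some_table.c.keys()
['x', 'y']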
See also
Read more about table and schema reflection at Reflecting Database Objects.
For ORM-related variants of table reflection, the section Mapping Declaratively with Reflected Tables includes an overview of the available options.
Next Steps¶
We now have a SQLite database ready to go with two tables present, and Core and ORM table-oriented constructs that we can use to interact with these tables via a Connection and/or ORM Session. In the following sections, we will illustrate how to create, manipulate, and select data using these structures.