Glottolog languoid tree as SQLite database

This tool loads the content of the languoids/tree directory from the Glottolog master repo into a normalized SQLite database.

Each file in that directory contains the definition of one Glottolog languoid. Loading their content into a relational database makes it possible to perform advanced consistency checks (example) and, more generally, to run queries that inspect the languoid tree relations in a compact and performant way (e.g. without repeatedly traversing the directory tree).

See pyglottolog for the more general official Python API to work with the repo without a mandatory initial loading step (it also provides programmatic access to the references and a convenient command-line interface).

The database can be exported into a ZIP file containing one CSV file for each database table, or written into a single denormalized CSV file with one row per languoid (via a provided SQL query).

As SQLite is the most widely deployed database engine, the database file itself (e.g. treedb.sqlite3) can be queried directly from most programming environments. It can also be examined using graphical tools such as DBeaver, or via the sqlite3 CLI.
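For example, a minimal sketch of direct access via Python's standard library sqlite3 module (this assumes the database has already been written to treedb.sqlite3 as shown below, and that the table holding languoids is named languoid):

>>> import sqlite3
>>> conn = sqlite3.connect('treedb.sqlite3')  # file written by treedb.load() below
>>> n_languoids, = conn.execute('SELECT count(*) FROM languoid').fetchone()  # total number of languoids
>>> conn.close()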

Python users can also use the provided SQLAlchemy models to build queries or additional abstractions programmatically, using SQLAlchemy core or the ORM (as a more maintainable alternative to hand-written SQL queries).
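For instance, a filtered query can be composed from the model's columns and then passed to treedb.iterrows() or treedb.write_csv() (see the Quickstart below); this sketch assumes the Languoid.level values shown in the examples further down:

>>> import treedb
>>> import sqlalchemy as sa
>>> select_families = sa.select(treedb.Languoid).where(treedb.Languoid.level == 'family')  # restrict to family-level languoids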

Quickstart

Install treedb (and dependencies):

$ pip install treedb

Clone the Glottolog master repo:

$ git clone https://github.com/glottolog/glottolog.git

Note: treedb expects to find it under ./glottolog/ by default (i.e. under the current directory); use treedb.set_root() to point it to a different path.
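For example (a minimal sketch; the '../glottolog' argument is just a placeholder for wherever your clone lives):

>>> import treedb
>>> root = treedb.set_root('../glottolog')  # use a clone outside the current directory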

Load ./glottolog/languoids/tree/**/md.ini into an in-memory sqlite3 database and write the result of the denormalized example query to treedb.query.csv:

$ python -c "import treedb; treedb.load(); treedb.write_csv()"

Usage from Python

Start a Python shell:

$ python

Import the package:

>>> import treedb

Use treedb.iterlanguoids() to iterate over languoids as (<path>, dict) pairs:

>>> next(treedb.iterlanguoids())
(('abin1243',), {'id': 'abin1243', 'parent_id': None, 'level': 'language', ...

Note: This is a low-level interface, which does not require loading.

Load the database into treedb.sqlite3 (and set the default engine):

>>> treedb.load('treedb.sqlite3')
...
<treedb._proxies.SqliteEngineProxy filename='treedb.sqlite3' ...>

Run consistency checks:

>>> treedb.check()
...
True

Export into a ZIP file containing one CSV file per database table:

>>> treedb.csv_zipfile()
...Path('treedb.zip')

Execute the example query and write it into a CSV file with one row per languoid:

>>> treedb.write_csv()
...Path('treedb.query.csv')

Rebuild the database (e.g. after an update):

>>> treedb.load(rebuild=True)
...
<treedb._proxies.SqliteEngineProxy filename='treedb.sqlite3' ...>

Execute a simple query with SQLAlchemy core and write the result to a CSV file:

>>> import sqlalchemy as sa
>>> treedb.write_csv(sa.select(treedb.Languoid), filename='languoids.csv')
...Path('languoids.csv')

Get one row from the languoid table via SQLAlchemy core (in Glottocode order):

>>> next(treedb.iterrows(sa.select(treedb.Languoid)))
('3adt1234', '3Ad-Tekles', 'dialect', 'nort3292', None, None, None, None)

Get one Languoid model instance via the SQLAlchemy ORM (in Glottocode order):

>>> session = treedb.Session()
>>> session.query(treedb.Languoid).first()
<Languoid id='3adt1234' level='dialect' name='3Ad-Tekles'>
>>> session.close()

License

This tool is distributed under the MIT license.
