aws_s3

aws_s3: PostgreSQL extension to import/export data from/to S3

Overview

| Package | Version | Category | License | Language |
|---------|---------|----------|---------|----------|
| aws_s3 | 0.0.1 | FDW | Apache-2.0 | SQL |

| ID | Extension | Bin | Lib | Load | Create | Trust | Reloc | Schema |
|------|--------|----|----|----|-----|----|-----|---|
| 8800 | aws_s3 | No | No | No | Yes | No | Yes | - |

Related: pg_parquet, hdfs_fdw, file_fdw, duckdb_fdw, wrappers, pg_bulkload, columnar, pg_analytics

Version

| Type | Repo | Version | PG Ver | Package | Deps |
|------|--------|-------|----------------|----------------------|---|
| EXT | PIGSTY | 0.0.1 | 18, 17, 16, 15, 14 | aws_s3 | - |
| RPM | PIGSTY | 0.0.1 | 18, 17, 16, 15, 14 | aws_s3_$v | - |
| DEB | PIGSTY | 0.0.1 | 18, 17, 16, 15, 14 | postgresql-$v-aws-s3 | - |
| OS / PG | PG18 | PG17 | PG16 | PG15 | PG14 |
|--------------|--------------|--------------|--------------|--------------|--------------|
| el8.x86_64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| el8.aarch64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| el9.x86_64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| el9.aarch64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| el10.x86_64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| el10.aarch64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| d12.x86_64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| d12.aarch64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| d13.x86_64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| d13.aarch64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| u22.x86_64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| u22.aarch64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| u24.x86_64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |
| u24.aarch64 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 | PIGSTY 0.0.1 |

Build

You can build the RPM / DEB packages for aws_s3 using pig build:

pig build pkg aws_s3         # build RPM / DEB packages

Install

You can install aws_s3 directly. First, make sure the PGDG and PIGSTY repositories are added and enabled:

pig repo add pgsql -u          # Add repo and update cache

Install the extension using pig or apt/yum/dnf:

pig install aws_s3           # Install for current active PG version
pig ext install -y aws_s3 -v 18  # PG 18
pig ext install -y aws_s3 -v 17  # PG 17
pig ext install -y aws_s3 -v 16  # PG 16
pig ext install -y aws_s3 -v 15  # PG 15
pig ext install -y aws_s3 -v 14  # PG 14
dnf install -y aws_s3_18       # PG 18
dnf install -y aws_s3_17       # PG 17
dnf install -y aws_s3_16       # PG 16
dnf install -y aws_s3_15       # PG 15
dnf install -y aws_s3_14       # PG 14
apt install -y postgresql-18-aws-s3   # PG 18
apt install -y postgresql-17-aws-s3   # PG 17
apt install -y postgresql-16-aws-s3   # PG 16
apt install -y postgresql-15-aws-s3   # PG 15
apt install -y postgresql-14-aws-s3   # PG 14

Create Extension:

CREATE EXTENSION aws_s3;

Usage


Setup Credentials

Configure AWS credentials via PostgreSQL session variables:

SET aws_s3.access_key_id TO 'your_access_key';
SET aws_s3.secret_key TO 'your_secret_key';
SET aws_s3.session_token TO 'optional_session_token';  -- if using temporary credentials

For local development with LocalStack:

SET aws_s3.endpoint_url TO 'http://localhost:4566';

Import Data from S3

CREATE EXTENSION aws_s3;

CREATE TABLE animals (
  name text,
  age int
);

SELECT aws_s3.table_import_from_s3(
  'animals',
  '',
  '(FORMAT CSV, DELIMITER '','', HEADER true)',
  'my-bucket',
  'animals.csv',
  'us-east-1'
);

SELECT * FROM animals;

Parameters: table name, column list (empty string for all), COPY options, S3 bucket, S3 key, AWS region.
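For instance, a sketch combining a non-empty column list with credentials passed directly as function arguments (the keyword argument names here follow the upstream chimpler/postgres-aws-s3 signature and may differ in your build; verify with \df aws_s3.table_import_from_s3):

```sql
SELECT aws_s3.table_import_from_s3(
  'animals',
  'name',                        -- import only the name column
  '(FORMAT CSV, HEADER true)',
  'my-bucket',
  'animals.csv',
  'us-east-1',
  access_key_id := 'your_access_key',        -- overrides session variables
  secret_access_key := 'your_secret_key'
);
```

Arguments passed this way take effect only for the single call, which is convenient when different imports use different IAM credentials.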

Export Data to S3

SELECT * FROM aws_s3.query_export_to_s3(
  'SELECT * FROM animals',
  'my-bucket',
  'export/animals.csv',
  'us-east-1',
  options := 'FORMAT CSV, HEADER true'
);

Parameters: SQL query, S3 bucket, S3 key, AWS region, COPY options.
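The function returns a row of upload counters, which can be selected explicitly. A hedged sketch, assuming the RDS-compatible output column names rows_uploaded, files_uploaded, and bytes_uploaded (verify against your build):

```sql
SELECT rows_uploaded, files_uploaded, bytes_uploaded
FROM aws_s3.query_export_to_s3(
  'SELECT * FROM animals WHERE age > 1',   -- any SQL query works here
  'my-bucket',
  'export/old_animals.csv',
  'us-east-1',
  options := 'FORMAT CSV, HEADER true'
);
```

Checking these counters after an export is a cheap way to confirm the query actually produced data before downstream jobs read the S3 object.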

Features

  • Files with Content-Encoding=gzip metadata are automatically decompressed during import
  • Credentials can be passed directly as function arguments or via session variables
  • Uses the same COPY format options as PostgreSQL (CSV, TEXT, BINARY, with all related settings like DELIMITER, HEADER, NULL, etc.)

Last Modified 2026-03-12: add pg extension catalog (95749bf)