
API Reference

Module-by-module reference for all public exports of effect-dynamodb.

Provides annotations, date schemas, storage modifiers, and model configuration for effect-dynamodb.

```ts
import { DynamoModel } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `Hidden` | `<S>(schema: S) => S` | Marks a field as hidden from `asModel`/`asRecord` decode. Still stored in DynamoDB, visible in `asItem`/`asNative` |
| `isHidden` | `(schema: Schema.Top) => boolean` | Check if a schema field has the `Hidden` annotation |
| `identifier` | `<S>(schema: S) => S` | Marks the primary business identifier on a model. Exactly one per entity. Required for entities referenced via `ref` |
| `isIdentifier` | `(schema: Schema.Top) => boolean` | Check if a schema field has the `identifier` annotation |
| `getIdentifierField` | `(model: Schema.Top) => { name, schema } \| undefined` | Find the identifier field in a model's `.fields` |
| `ref` | `<S>(schema: S) => S` | Marks a field as a denormalized reference to another entity. Transforms types in `Entity` (Input: ID, Record: full object, Update: optional ID) |
| `isRef` | `(schema: Schema.Top) => boolean` | Check if a schema field has the `ref` annotation |
| `getRefAnnotation` | `(schema: Schema.Top) => RefAnnotation \| undefined` | Get the full `ref` annotation metadata |

All date schemas carry a DynamoEncoding annotation that controls how the field is stored in DynamoDB.

| Export | Wire Type | Domain Type | DynamoDB Storage |
| --- | --- | --- | --- |
| `DateString` | ISO 8601 string | `DateTime.Utc` | String (S) |
| `DateEpochMs` | epoch milliseconds | `DateTime.Utc` | Number (N) |
| `DateEpochSeconds` | epoch seconds | `DateTime.Utc` | Number (N) |
| `DateEpoch(opts)` | auto-detect ms/seconds | `DateTime.Utc` | configurable |
| `DateTimeZoned` | ISO+offset+zone string | `DateTime.Zoned` | String (S) |
| `UnsafeDateString` | ISO 8601 string | native `Date` | String (S) |
| `UnsafeDateEpochMs` | epoch milliseconds | native `Date` | Number (N) |
| `UnsafeDateEpochSeconds` | epoch seconds | native `Date` | Number (N) |
| `TTL` | epoch seconds | `DateTime.Utc` | Number (N) — alias for `DateEpochSeconds` |

| Export | Type | Description |
| --- | --- | --- |
| `storedAs` | `<A>(storageSchema: Schema<A>) => (fieldSchema: Schema<A>) => Schema<A>` | Override DynamoDB storage format. Type-safe: domain types must match |

| Export | Type | Description |
| --- | --- | --- |
| `configure` | `(model, attributes) => ConfiguredModel<M>` | Wrap a model with per-field DynamoDB overrides (field renaming, storage encoding, immutable) |
| `isConfiguredModel` | `(value) => boolean` | Check if a value is a ConfiguredModel |

| Export | Type | Description |
| --- | --- | --- |
| `DynamoEncodingKey` | `symbol` | Annotation key for reading DynamoEncoding from schema AST |
| `getEncoding` | `(schema) => DynamoEncoding \| undefined` | Read the DynamoEncoding annotation from a schema |

| Type | Description |
| --- | --- |
| `DynamoEncoding` | `{ storage: "string" \| "epochMs" \| "epochSeconds", domain: "DateTime.Utc" \| "DateTime.Zoned" \| "Date" }` |
| `ConfiguredModel<M>` | Wrapper carrying original model + per-field attribute overrides |
| `RefAnnotation` | `{ _tag: "Ref", refSchemaId?: string }` — annotation metadata for ref fields |
```ts
// Annotations
class Employee extends Schema.Class<Employee>("Employee")({
  employeeId: Schema.String,
  createdBy: Schema.String,
  internalId: Schema.String.pipe(DynamoModel.Hidden),
}) {}

// configure — immutable fields + field overrides
const EmployeeModel = DynamoModel.configure(Employee, {
  createdBy: { immutable: true },
})

// Date schemas
class Event extends Schema.Class<Event>("Event")({
  eventId: Schema.String,
  startedAt: DynamoModel.DateString, // ISO string ↔ DateTime.Utc
  expiresAt: DynamoModel.DateEpochSeconds, // epoch seconds (for TTL)
  scheduledAt: DynamoModel.DateTimeZoned, // with timezone
}) {}

// storedAs — override storage format
class Order extends Schema.Class<Order>("Order")({
  orderId: Schema.String,
  // Wire: ISO string, stored in DynamoDB as epoch seconds (for TTL)
  expiresAt: DynamoModel.DateString.pipe(DynamoModel.storedAs(DynamoModel.DateEpochSeconds)),
}) {}

// configure — field renaming + storage overrides
const OrderModel = DynamoModel.configure(Order, {
  expiresAt: { field: "ttl", storedAs: DynamoModel.DateEpochSeconds },
})

// identifier — marks primary business ID for ref resolution
class Team extends Schema.Class<Team>("Team")({
  id: Schema.String.pipe(DynamoModel.identifier),
  name: Schema.String,
}) {}

// ref — denormalized reference to another entity
class Selection extends Schema.Class<Selection>("Selection")({
  team: Team.pipe(DynamoModel.ref), // Input: teamId, Record: Team
  player: Player.pipe(DynamoModel.ref), // Input: playerId, Record: Player
}) {}
```

Application namespace for key prefixing and versioning.

```ts
import { DynamoSchema } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(config) => DynamoSchema` | Create a schema with name, version, and optional casing |
| `prefix` | `(schema) => string` | Build schema prefix: `$name#vN` |
| `applyCasing` | `(schema, value) => string` | Apply casing rules to a structural key part |
| `composeKey` | `(schema, entityType, composites) => string` | Compose entity key: `$schema#vN#entity_type#composites` |
| `composeCollectionKey` | `(schema, collection, composites) => string` | Compose collection PK |
| `composeClusteredSortKey` | `(schema, collection, entity, composites) => string` | Compose clustered SK |
| `composeIsolatedSortKey` | `(schema, entity, composites) => string` | Compose isolated SK |
| `composeUniqueKey` | `(schema, entity, constraint, values) => { pk, sk }` | Compose unique constraint sentinel keys |
| `composeVersionKey` | `(schema, entity, version) => string` | Compose version snapshot SK |
| `composeDeletedKey` | `(schema, entity, timestamp) => string` | Compose soft-deleted item SK |
| `composeVersionKeyPrefix` | `(schema, entity) => string` | Prefix for version queries |
| `composeDeletedKeyPrefix` | `(schema, entity) => string` | Prefix for deleted queries |

| Type | Description |
| --- | --- |
| `DynamoSchema` | Interface: `{ name: string, version: number, casing: Casing }` |
| `Casing` | `"lowercase" \| "uppercase" \| "preserve"` |
```ts
const AppSchema = DynamoSchema.make({ name: "myapp", version: 1 })
// Keys generated as: $myapp#v1#entity_type#composites
```
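The documented key layout (`$name#vN#entity_type#composites`) can be illustrated with a small standalone sketch. This is not the library's implementation (the real API is `DynamoSchema.composeKey`), and the assumption that casing applies to structural parts but not to composite values is inferred from the `applyCasing` description above:

```typescript
// Illustration only: mimics the documented "$name#vN#entity_type#composites"
// layout. The real implementation lives in DynamoSchema.composeKey.
interface SchemaLike {
  readonly name: string
  readonly version: number
  readonly casing: "lowercase" | "uppercase" | "preserve"
}

// Casing is applied to structural key parts (schema name, entity type).
const applyCasing = (schema: SchemaLike, part: string): string =>
  schema.casing === "uppercase"
    ? part.toUpperCase()
    : schema.casing === "lowercase"
      ? part.toLowerCase()
      : part

// Join the "$name#vN" prefix, the entity type, and the composite values with "#".
const composeKey = (
  schema: SchemaLike,
  entityType: string,
  composites: ReadonlyArray<string>,
): string =>
  [
    `$${applyCasing(schema, schema.name)}#v${schema.version}`,
    applyCasing(schema, entityType),
    ...composites,
  ].join("#")

// composeKey({ name: "myapp", version: 1, casing: "lowercase" }, "user", ["u-1"])
// → "$myapp#v1#user#u-1"
```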

See Modeling for details.


Minimal table definition with Layer-based name injection.

```ts
import { Table } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(config: { schema, entities?, aggregates? }) => Table` | Create a table definition with entities and aggregates |
| `definition` | `(table) => TableDefinition` | Derive CreateTableCommandInput fields from entity index declarations |

| Type | Description |
| --- | --- |
| `Table` | Interface with schema, entities, aggregates, `layer()`, `layerConfig()`, and `Tag` for DI |
| `TableConfig` | `{ name: string }` — runtime table configuration |
| `TableDefinition` | `{ KeySchema, AttributeDefinitions, GlobalSecondaryIndexes? }` |
```ts
const MainTable = Table.make({ schema: AppSchema, entities: { UserEntity, TaskEntity } })

// Runtime name injection via Layer
MainTable.layer({ name: "MyTable" })

// Typed client auto-derives table schema for creation
const db = yield* DynamoClient.make({
  entities: { UserEntity, TaskEntity },
  tables: { MainTable },
})
yield* db.tables.MainTable.create()
```

See Modeling for details.


The core module. It binds models to tables and provides CRUD operations, query accessors, lifecycle management, and seven derived types. It is the largest module in the package.

```ts
import { Entity } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(config) => Entity` | Create an entity with model, table, indexes, and optional system fields |

`make` config:

| Property | Type | Required | Description |
| --- | --- | --- | --- |
| `model` | `Schema.Class \| Schema.Struct \| ConfiguredModel` | Yes | Domain model schema (or configured model with field overrides) |
| `entityType` | `string` | Yes | Discriminator stored as `__edd_e__` |
| `primaryKey` | `{ pk: KeyDef, sk: KeyDef }` | Yes | Primary key composition rules |
| `indexes` | `Record<string, GsiIndexDef>` | No | GSI index definitions with name, pk, sk, optional collection |
| `timestamps` | `boolean \| { created?, updated? }` | No | Auto-managed `createdAt`/`updatedAt` |
| `versioned` | `boolean \| { retain?, field?, ttl? }` | No | Auto-increment version with optional snapshot retention |
| `softDelete` | `boolean \| { ttl?, preserveUnique? }` | No | Soft delete with optional TTL |
| `unique` | `Record<string, string[]>` | No | Unique constraint definitions |
| `refs` | `Record<string, Entity>` | No | Map ref field names to their source entities for ref hydration |

These are methods on the entity definition returned by Entity.make(). They return operation descriptors used by Transaction, Batch, and advanced pipeable workflows. For day-to-day CRUD, use DynamoClient.make(table) to get a typed client with executable methods (see below).

| Operation | Signature | Description |
| --- | --- | --- |
| `entity.put(input)` | `EntityPut` | Insert or overwrite an item (descriptor) |
| `entity.create(input)` | `EntityPut` | Insert only — fails with ConditionalCheckFailed if exists (descriptor) |
| `entity.upsert(input)` | `EntityPut` | Create or update — uses `if_not_exists()` for immutable fields, createdAt, version (descriptor) |
| `entity.get(key)` | `EntityGet` | Get a single item by primary key (descriptor) |
| `entity.update(key)` | `EntityUpdate` | Start an update operation (compose with `set`, `add`, etc.) (descriptor) |
| `entity.patch(key)` | `EntityUpdate` | Update with `attribute_exists` — fails with ConditionalCheckFailed if not exists (descriptor) |
| `entity.delete(key)` | `EntityDelete` | Delete an item (descriptor) |
| `entity.deleteIfExists(key)` | `EntityDelete` | Delete with `attribute_exists` — fails with ConditionalCheckFailed if not exists (descriptor) |
| `entity.query.<indexName>(pk)` | `Query<Record>` | Query a specific index by partition key (descriptor) |
| `entity.scan()` | `Query<Record>` | Full table scan filtered by entity type (descriptor) |
| `entity.versions(key)` | `Query<Record>` | Query version history for an item (requires `versioned: { retain: true }`) (descriptor) |
| `entity.deleted.list(key)` | `Query<Record>` | List soft-deleted items for a key (requires `softDelete`) (descriptor) |
| `entity.batchGet(keys)` | See Batch module | Batch get items |
| `entity.batchPut(items)` | See Batch module | Batch put items |
| `entity.batchDelete(keys)` | See Batch module | Batch delete items |

DynamoClient.make({ entities, aggregates?, tables? }) resolves dependencies and returns a typed client with executable operations for all listed entities and aggregates, namespaced under entities, aggregates, collections, and tables:

```ts
const MainTable = Table.make({ schema: AppSchema, entities: { UserEntity } })
const db = yield* DynamoClient.make({
  entities: { UserEntity },
  tables: { MainTable },
})
const users = db.entities.UserEntity
```

Bound entity methods:

| Method | Description |
| --- | --- |
| `bound.get(key)` | Get a single item |
| `bound.put(input, ...combinators)` | Insert or overwrite. Optional combinators (e.g. `condition(...)`) |
| `bound.create(input, ...combinators)` | Insert only — fails with ConditionalCheckFailed if exists |
| `bound.upsert(input, ...combinators)` | Create or update — `if_not_exists()` for immutable fields, createdAt, version |
| `bound.update(key, ...combinators)` | Update an item — compose with `Entity.set(...)`, `Entity.expectedVersion(...)`, etc. |
| `bound.patch(key, ...combinators)` | Update with `attribute_exists` — fails if item does not exist |
| `bound.delete(key, ...combinators)` | Delete an item. Optional combinators (e.g. `condition(...)`) |
| `bound.deleteIfExists(key, ...combinators)` | Delete with `attribute_exists` — fails if item does not exist |
| `bound.paginate(query, ...combinators)` | Execute a query and return a lazy `Stream<A>` of items, automatically paginating |
| `bound.collect(query, ...combinators)` | Execute a query and collect all pages into `Effect<Array<A>>` |

Bound lifecycle methods (require matching entity config):

| Method | Requires | Returns | Description |
| --- | --- | --- | --- |
| `bound.getVersion(key, version)` | `versioned: { retain: true }` | Effect | Get a specific version snapshot |
| `bound.versions(key)` | `versioned: { retain: true }` | BoundQuery | List all version snapshots as a fluent BoundQuery — supports `.collect()`, `.fetch()`, `.paginate()`, `.count()`, `.limit()`, `.reverse()`, `.startFrom()`, `.filter()`, `.select()` |
| `bound.restore(key)` | `softDelete` | Effect | Restore a soft-deleted item |
| `bound.purge(key)` | Any | Effect | Permanently remove item + all versions and sentinels |
| `bound.deleted.get(key)` | `softDelete` | Effect | Get a specific soft-deleted item |
| `bound.deleted.list(key)` | `softDelete` | BoundQuery | List all soft-deleted tombstones in the partition as a fluent BoundQuery |

Entity definition lifecycle operations (require matching config — listed in Operations table above for query descriptors):

| Operation | Requires | Description |
| --- | --- | --- |
| `entity.getVersion(key, version)` | `versioned: { retain: true }` | Get a specific version snapshot (descriptor) |
| `entity.deleted.get(key)` | `softDelete` | Get a soft-deleted item (descriptor) |
| `entity.restore(key)` | `softDelete` | Restore a soft-deleted item (descriptor) |
| `entity.purge(key)` | Any | Delete all items in the partition (main + versions + deleted) (descriptor) |

These functions transform entity operations via pipe:

| Export | Works On | Description |
| --- | --- | --- |
| `set(updates)` | EntityUpdate | Set fields to new values (dual API) |
| `expectedVersion(n)` | EntityUpdate | Optimistic lock — fail if version doesn't match |
| `consistentRead` | EntityGet | Enable strongly consistent reads |
| `condition(input)` | EntityPut, EntityUpdate, EntityDelete | Add a ConditionExpression |
| `remove(fields)` | EntityUpdate | REMOVE attributes from the item |
| `add(values)` | EntityUpdate | Atomically ADD to numeric attributes |
| `subtract(values)` | EntityUpdate | Subtract from numeric attributes (`SET #f = #f - :v`) |
| `append(values)` | EntityUpdate | Append to list attributes (`list_append`) |
| `deleteFromSet(values)` | EntityUpdate | DELETE elements from Set attributes |
| `returnValues(mode)` | EntityUpdate, EntityDelete | Control DynamoDB ReturnValues: `"none" \| "allOld" \| "allNew" \| "updatedOld" \| "updatedNew"` |
| `cascade(config)` | EntityUpdate | Propagate source entity changes to target entities that embed it via `DynamoModel.ref`. Config: `{ targets, filter?, mode? }` (dual API) |

Control what type an operation returns:

| Export | Returns | Description |
| --- | --- | --- |
| `asModel` | `Entity.Model<E>` | Pure domain object (default for `yield*`) |
| `asRecord` | `Entity.Record<E>` | Domain + system fields (version, timestamps) |
| `asItem` | `Entity.Item<E>` | Full DynamoDB item (all keys + `__edd_e__`) |
| `asNative` | `Entity.Marshalled<E>` | Raw DynamoDB AttributeValue format |

| Type | Description |
| --- | --- |
| `Entity.Model<E>` | Pure domain object fields |
| `Entity.Record<E>` | Model + system metadata (version, timestamps) |
| `Entity.Input<E>` | Creation input (model fields, no system fields) |
| `Entity.Update<E>` | Mutable fields only (keys and immutable excluded) |
| `Entity.Key<E>` | Primary key attributes only |
| `Entity.Item<E>` | Full DynamoDB item (model + system + keys + `__edd_e__`) |
| `Entity.Marshalled<E>` | DynamoDB AttributeValue format |

| Export | Description |
| --- | --- |
| `keyAttributes(entity)` | List all key attribute names (primary + GSI) |
| `keyFieldNames(entity)` | List physical field names for all keys |
| `compositeAttributes(entity)` | List all composite attribute names across indexes |
| `itemSchema(entity)` | Get the item-level decode schema |
| `decodeMarshalledItem(entity, item)` | Decode a marshalled DynamoDB item through entity schema |
```ts
const UserEntity = Entity.make({
  model: User,
  entityType: "User",
  primaryKey: {
    pk: { field: "pk", composite: ["userId"] },
    sk: { field: "sk", composite: [] },
  },
  indexes: {
    byEmail: {
      name: "gsi1",
      pk: { field: "gsi1pk", composite: ["email"] },
      sk: { field: "gsi1sk", composite: [] },
    },
  },
  timestamps: true,
  versioned: true,
  unique: { email: ["email"] },
})

// Get typed client with executable operations
const db = yield* DynamoClient.make({
  entities: { UserEntity },
  tables: { MainTable },
})
const users = db.entities.UserEntity

// CRUD
const user = yield* users.put({ userId: "u-1", email: "a@b.com", ... })
const found = yield* users.get({ userId: "u-1" })

// Update with combinators
yield* users.update({ userId: "u-1" }, Entity.set({ email: "new@b.com" }))

// Update with multiple combinators
yield* users.update(
  { userId: "u-1" },
  Entity.set({ email: "new@b.com" }),
  Entity.expectedVersion(1),
)

yield* users.delete({ userId: "u-1" })

// Query execution via bound entity
const allUsers = yield* users.collect(UserEntity.query.byEmail({ email: "a@b.com" }))
const stream = users.paginate(UserEntity.scan(), Query.limit(100))

// Or with v2 entity-centric pattern (BoundQuery fluent API):
// const db = yield* DynamoClient.make({ entities: { UserEntity } })
// const allUsers = yield* db.entities.UserEntity.byEmail({ email: "a@b.com" }).collect()
```

See Getting Started and Modeling for details.


Pipeable Query<A> data type — a lazy, immutable description of a DynamoDB query or scan.

```ts
import { Query } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `where(conditions)` | Dual | Add sort key conditions (KeyConditionExpression) |
| `limit(n)` | Dual | Set max items per page |
| `maxPages(n)` | Dual | Limit total number of pages fetched |
| `reverse` | Combinator | Reverse sort order (descending) |
| `consistentRead` | Combinator | Enable strongly consistent reads |
| `ignoreOwnership` | Combinator | Skip `__edd_e__` entity type filter — for mixed-table scenarios |
| `startFrom(cursor)` | Dual | Resume pagination from a previous cursor |
| `select(attrs)` | Dual | Project specific attributes (returns `Query<Record<string, unknown>>`) |
| `filterExpr(expr)` | Dual | Add an Expr ADT filter (from callback API) |
| `selectPaths(paths)` | Dual | Project path segments (from callback API) |

| Export | Returns | Description |
| --- | --- | --- |
| `execute` | `Effect<Page<A>>` | Execute and return a page: `{ items: Array<A>, cursor: string \| null }` |
| `collect` | `Effect<Array<A>>` | Execute, fetch all pages, and flatten into a single array |
| `paginate` | `Effect<Stream<Array<A>>>` | Execute and return a Stream of pages for lazy pagination |
| `count` | `Effect<number>` | Execute with SELECT COUNT — returns total matching items (respects maxPages) |
| `asParams` | `Effect<Record<string, unknown>>` | Return built DynamoDB command input without executing — useful for debugging |

Used in entity-level filter():

| Operator | Example | Description |
| --- | --- | --- |
| Equality | `{ status: "active" }` | Exact match |
| `ne` | `{ status: { ne: "deleted" } }` | Not equal |
| `gt` | `{ price: { gt: 30 } }` | Greater than |
| `gte` | `{ price: { gte: 30 } }` | Greater than or equal |
| `lt` | `{ price: { lt: 100 } }` | Less than |
| `lte` | `{ price: { lte: 100 } }` | Less than or equal |
| `between` | `{ price: { between: [10, 50] } }` | Inclusive range |
| `beginsWith` | `{ name: { beginsWith: "A" } }` | String prefix |
| `contains` | `{ name: { contains: "widget" } }` | Substring match |
| `exists` | `{ email: { exists: true } }` | Attribute exists |
| `notExists` | `{ email: { notExists: true } }` | Attribute does not exist |

| Export | Description |
| --- | --- |
| `isQuery(value)` | Type guard for `Query<A>` |
```ts
const db = yield* DynamoClient.make({
  entities: { TaskEntity },
  tables: { MainTable },
})
const tasks = db.entities.TaskEntity

const items = yield* tasks.collect(
  TaskEntity.query.byProject({ projectId: "p-1" }),
  Query.where({ beginsWith: activePrefix }),
  TaskEntity.filter({ priority: { gt: 3 } }),
  Query.reverse,
  Query.limit(25),
)
```

See Queries for details.


The fluent query builder returned by every entity query accessor on a BoundEntity (e.g. db.entities.Tasks.byProject({...})) and by db.entities.Tasks.scan(). BoundQuery<Model, SkRemaining, A> wraps an internal Query<A> with pre-resolved services so all terminals return Effect<..., ..., never>.

```ts
import type { BoundQuery } from "effect-dynamodb"
```

Combinators are immutable — each call returns a new BoundQuery. Terminals execute the query.

| Method | Description |
| --- | --- |
| `.where((t, ops) => …)` | Type-safe sort key condition on remaining SK composites. Only available when SK composites have not all been consumed. Consumes `SkRemaining` (cannot be called twice). Operators: `eq`, `lt`, `lte`, `gt`, `gte`, `between`, `beginsWith`. |
| `.filter((t, ops) => …)` | Post-read filter expression via callback. Type-safe attribute paths via `t`, condition operators via `ops`. |
| `.filter(shorthand)` | Post-read filter via shorthand object (e.g. `{ status: "active" }` or `{ gt: { price: 30 } }`). |
| `.select((t) => […paths])` | Projection expression via callback. Returns `BoundQuery<…, Record<string, unknown>>`. |
| `.select(["field", …])` | Projection via attribute name array. |
| `.limit(n)` | Maximum items per DynamoDB page. |
| `.maxPages(n)` | Maximum number of DynamoDB pages to fetch. |
| `.reverse()` | Reverse sort order (`ScanIndexForward = false`). |
| `.startFrom(cursor)` | Resume pagination from an opaque cursor returned by `.fetch()`. |
| `.consistentRead()` | Enable strongly consistent reads. |
| `.ignoreOwnership()` | Skip the `__edd_e__` entity-type filter. Use only when querying a polymorphic GSI shared across entity types and you want every item back. |

| Method | Returns |
| --- | --- |
| `.collect()` | `Effect<Array<A>, DynamoClientError \| ValidationError, never>` — drain all pages into a single array. |
| `.fetch()` | `Effect<Page<A>, DynamoClientError \| ValidationError, never>` — single page + opaque cursor. `Page<A>` is `{ items: Array<A>, cursor: string \| undefined }`. |
| `.paginate()` | `Stream<A, DynamoClientError \| ValidationError, never>` — lazy stream that paginates automatically. |
| `.count()` | `Effect<number, DynamoClientError, never>` — count-only query (`Select: COUNT`); items are not returned. |

.where() is conditionally available based on the SkRemaining type parameter. When the index’s SK composites have already been fully consumed (e.g., the entity has no SK composites, or all of them were supplied to the query accessor), .where() is not present on the type — calling it is a compile error. Once called, .where() consumes SkRemaining and the resulting BoundQuery no longer exposes .where().

```ts
const db = yield* DynamoClient.make({ entities: { Tasks } })

// Sort key condition + filter + limit
const recent = yield* db.entities.Tasks
  .byProject({ projectId: "p-1" })
  .where((t, { beginsWith }) => beginsWith(t.createdAt, "2026"))
  .filter((t, { eq }) => eq(t.status, "active"))
  .limit(50)
  .collect()

// Single page + cursor
const page = yield* db.entities.Tasks
  .byProject({ projectId: "p-1" })
  .fetch()

// Lazy stream
yield* db.entities.Tasks
  .byProject({ projectId: "p-1" })
  .paginate()
  .pipe(Stream.runForEach((task) => Console.log(task.title)))

// Count only
const total = yield* db.entities.Tasks.byProject({ projectId: "p-1" }).count()
```

See Queries and Expressions for details.


Multi-entity queries across a shared index. Collections are auto-discovered from entity indexes that share the same collection property.

When multiple entities define the same collection name on the same GSI, they are automatically grouped into a collection accessible via db.collections.<name>():

```ts
// Define entities with shared collection
const Employees = Entity.make({
  // ...
  indexes: {
    byTenant: {
      collection: "tenantMembers",
      name: "gsi1",
      pk: { field: "gsi1pk", composite: ["tenantId"] },
      sk: { field: "gsi1sk", composite: ["name"] },
    },
  },
})
const Tasks = Entity.make({
  // ...
  indexes: {
    byTenant: {
      collection: "tenantMembers",
      name: "gsi1",
      pk: { field: "gsi1pk", composite: ["tenantId"] },
      sk: { field: "gsi1sk", composite: ["priority"] },
    },
  },
})

// Auto-discovered on the typed client
const db = yield* DynamoClient.make({ entities: { Employees, Tasks } })
const result = yield* db.collections.tenantMembers({ tenantId: "t-1" }).collect()
// result: { Employees: Employee[], Tasks: Task[] }
```

For advanced use, Collection.make() creates collections explicitly:

```ts
import { Collection } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(name, entities) => Collection` | Create a collection from entities sharing an index |

See Indexes & Collections for details.


Atomic multi-item operations.

```ts
import { Transaction } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `transactGet` | `(items) => Effect<Tuple>` | Atomically get up to 100 items — returns a typed tuple |
| `transactWrite` | `(ops) => Effect<void>` | Atomically write up to 100 items (puts, deletes, condition checks) |
| `check` | `(get, condition) => ConditionCheckOp` | Create a condition-check operation for transactWrite (dual API) |
```ts
const [user, order] = yield* Transaction.transactGet(
  UserEntity.get({ userId: "u-1" }),
  OrderEntity.get({ orderId: "o-1" }),
)

yield* Transaction.transactWrite(
  UserEntity.put({ userId: "u-1", ... }),
  OrderEntity.delete({ orderId: "o-old" }),
  Transaction.check(
    UserEntity.get({ userId: "u-1" }),
    { attributeExists: "email" },
  ),
)
```

| Type | Description |
| --- | --- |
| `ConditionCheckOp` | A condition-check operation for inclusion in transactWrite |

Batch get and write with auto-chunking and unprocessed item retry.

```ts
import { Batch } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `get` | `(...gets) => Effect<Tuple>` | Batch-get up to 100 items with typed tuple return |
| `write` | `(...ops) => Effect<void>` | Batch-write any number of items (auto-chunks at 25) |
```ts
// Batch get — typed positional results
const [user1, user2] = yield* Batch.get(
  UserEntity.get({ userId: "u-1" }),
  UserEntity.get({ userId: "u-2" }),
)

// Batch write — mixed puts and deletes
yield* Batch.write(
  UserEntity.put({ userId: "u-1", ... }),
  UserEntity.put({ userId: "u-2", ... }),
  OrderEntity.delete({ orderId: "o-old" }),
)
```

Auto-chunking: get chunks at 100 items, write chunks at 25 items. Both retry unprocessed items automatically.
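The chunking behavior can be pictured with a small generic helper. This is a sketch of the idea only, not the library's internals (which also handle unprocessed-item retry):

```typescript
// Sketch of auto-chunking: split a batch into DynamoDB-sized pages.
// BatchWriteItem accepts at most 25 requests per call; BatchGetItem at most 100 keys.
const chunk = <T>(items: ReadonlyArray<T>, size: number): T[][] => {
  const pages: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    pages.push(items.slice(i, i + size))
  }
  return pages
}

// 60 write requests become three BatchWriteItem calls: 25 + 25 + 10.
// chunk(Array.from({ length: 60 }, (_, i) => i), 25).map((p) => p.length)
// → [25, 25, 10]
```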


Condition, filter, and update expression builders.

```ts
import { Expression } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `condition` | `(input: ConditionInput) => ExpressionResult` | Build ConditionExpression |
| `filter` | `(input: ConditionInput) => ExpressionResult` | Build FilterExpression |
| `update` | `(input: UpdateInput) => ExpressionResult` | Build UpdateExpression |

| Operator | Example | Description |
| --- | --- | --- |
| `eq` | `{ eq: { status: "active" } }` | Equality |
| `ne` | `{ ne: { status: "deleted" } }` | Not equal |
| `gt`, `gte`, `lt`, `lte` | `{ gt: { price: 0 } }` | Comparisons |
| `between` | `{ between: { price: [10, 50] } }` | Inclusive range |
| `beginsWith` | `{ beginsWith: { name: "A" } }` | String prefix |
| `attributeExists` | `{ attributeExists: "email" }` | Attribute exists |
| `attributeNotExists` | `{ attributeNotExists: "pk" }` | Attribute does not exist |

| Property | Example | Description |
| --- | --- | --- |
| `set` | `{ set: { name: "New" } }` | SET attribute values |
| `remove` | `{ remove: ["oldField"] }` | REMOVE attributes |
| `add` | `{ add: { count: 1 } }` | ADD to numeric/set attributes |
| `delete` | `{ delete: { tags: new Set(["old"]) } }` | DELETE from set attributes |

| Type | Description |
| --- | --- |
| `ExpressionResult` | `{ expression: string, names: Record, values: Record }` |
| `ConditionInput` | Declarative condition expression input |
| `UpdateInput` | Declarative update expression input |
```ts
const cond = Expression.condition({
  eq: { status: "active" },
  gt: { stock: 0 },
})
// cond.expression: "#status = :v0 AND #stock > :v1"
```

See Expressions Guide for comprehensive reference.

Type-safe expression building with PathBuilder and ConditionOps.

```ts
import {
  compileExpr, createConditionOps, createPathBuilder,
  isExpr, parseShorthand, parseSimpleShorthand,
  type Expr, type ConditionOps, type Path, type PathBuilder,
} from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `createPathBuilder<M>()` | `() => PathBuilder<M, M>` | Create a path proxy for type-safe attribute access |
| `createConditionOps<M>()` | `() => ConditionOps<M>` | Create comparison and logical operators |
| `compileExpr(expr, resolveDbName?)` | `(Expr) => CompileResult` | Compile Expr to DynamoDB expression string |
| `isExpr(u)` | `(unknown) => u is Expr` | Type guard for Expr nodes |
| `parseShorthand(input)` | `(ConditionInput) => Expr` | Convert ConditionInput to Expr |
| `parseSimpleShorthand(input)` | `(Record) => Expr` | Convert `{ key: value }` to Expr |

Entity-Level Combinators:

| Combinator | Description |
| --- | --- |
| `Entity.condition(cb \| shorthand)` | Callback or object → condition combinator for put/update/delete |
| `Entity.filter(cb \| shorthand)` | Callback or object → filter combinator for query/scan |
| `Entity.select(cb \| attrs)` | Callback or string array → projection combinator |

Path-Based Update Combinators:

| Combinator | Description |
| --- | --- |
| `Entity.pathSet(op)` | SET nested path or attribute-to-attribute copy |
| `Entity.pathRemove(segments)` | REMOVE nested path or array element |
| `Entity.pathAdd(op)` | ADD to nested numeric/set |
| `Entity.pathSubtract(op)` | Subtract from nested numeric |
| `Entity.pathAppend(op)` | Append to nested list |
| `Entity.pathPrepend(op)` | Prepend to nested list |
| `Entity.pathIfNotExists(op)` | Set only if attribute doesn't exist |
| `Entity.pathDelete(op)` | DELETE from nested set |

Types:

| Type | Description |
| --- | --- |
| `Expr` | Discriminated union of 16 expression node types |
| `ConditionOps<Model>` | Typed comparison/logical operators for callbacks |
| `PathBuilder<Root, Model>` | Recursive proxy type for attribute path access |
| `Path<Root, Value, Keys>` | Resolved attribute path with phantom types |
| `SizeOperand<Root>` | `size()` operand for path-based size comparisons |
| `CompileResult` | `{ expression: string, names: Record, values: Record }` |
| `DeepPick<T, Paths>` | Type utility for projection return type narrowing |

ProjectionExpression builder for selecting specific attributes.

```ts
import { Projection } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `projection` | `(attrs: string[]) => ProjectionResult` | Build ProjectionExpression from attribute names |

| Type | Description |
| --- | --- |
| `ProjectionResult` | `{ expression: string, names: Record<string, string> }` |

```ts
const proj = Projection.projection(["name", "email", "status"])
// proj.expression: "#proj_name, #proj_email, #proj_status"
// proj.names: { "#proj_name": "name", "#proj_email": "email", "#proj_status": "status" }
```

Composite key composition from index definitions. Used internally by Entity, also available for advanced use cases.

```ts
import { KeyComposer } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `composePk` | `(schema, entity, index, record) => string` | Compose partition key value |
| `composeSk` | `(schema, entity, index, record) => string` | Compose sort key value |
| `composeIndexKeys` | `(schema, entity, index, record) => Record` | Compose all key attributes for an index |
| `tryComposeIndexKeys` | `(schema, entity, index, record) => Record \| undefined` | Non-throwing variant for sparse GSIs |
| `composeAllKeys` | `(schema, entity, indexes, record) => Record` | Compose keys for all indexes |
| `composeGsiKeysForUpdatePolicyAware` | `(schema, entity, indexes, updates, keyRecord, { removedSet? }) => { sets, removes }` | Policy-aware GSI key composition for update/append (sparse/preserve per indexPolicy) |
| `composeSortKeyPrefix` | `(schema, entity, index, composites) => string` | Partial SK prefix for begins_with queries |
| `extractComposites` | `(keyPart, record) => string[]` | Extract composite attribute values from a record |
| `tryExtractComposites` | `(keyPart, record) => string[] \| undefined` | Non-throwing variant |
| `serializeValue` | `(value) => string` | Serialize a value for key composition |

| Type | Description |
| --- | --- |
| `KeyPart` | `{ field: string, composite: string[] }` |
| `IndexDefinition` | `{ pk: KeyPart, sk: KeyPart, index?, collection?, type?, casing?, indexPolicy? }` |
| `IndexPolicyAttr` | `"sparse" \| "preserve"` |
| `IndexPolicy` | `(item) => Partial<Record<string, IndexPolicyAttr>>` |
| `GsiUpdateResult` | `{ sets: Record<string, string>, removes: ReadonlyArray<string> }` |
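To see why a partial sort-key prefix is useful, note that a `begins_with` key condition matches every item whose full SK starts with the supplied leading composites. A hedged standalone illustration (the `joinKeyParts` helper is hypothetical; the real API is `KeyComposer.composeSortKeyPrefix`):

```typescript
// Illustration only: a begins_with prefix is the leading composites joined
// with the same separator used for full sort keys.
const joinKeyParts = (parts: ReadonlyArray<string>): string => parts.join("#")

// A full SK with all composites, and a prefix built from the leading ones:
const fullSk = joinKeyParts(["task", "2026", "01", "15"]) // "task#2026#01#15"
const skPrefix = joinKeyParts(["task", "2026"])           // "task#2026"

// begins_with(sk, skPrefix) matches every item whose SK starts with the prefix:
// fullSk.startsWith(skPrefix) → true
```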

Thin wrapper around @aws-sdk/util-dynamodb.

```ts
import { Marshaller } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `toAttributeMap` | `(record) => Record<string, AttributeValue>` | Marshall JS object to DynamoDB format |
| `fromAttributeMap` | `(item) => Record<string, unknown>` | Unmarshall DynamoDB format to JS object |
| `toAttributeValue` | `(value) => AttributeValue` | Marshall a single value |
| `fromAttributeValue` | `(av) => unknown` | Unmarshall a single value |
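The marshalled shape is the standard DynamoDB AttributeValue format. A minimal sketch of the mapping for a few primitive cases (the library delegates to `@aws-sdk/util-dynamodb`, which also covers sets, binary data, and marshalling options; this toy version is for illustration only):

```typescript
// Minimal sketch of DynamoDB AttributeValue marshalling for a few cases.
type AttributeValue =
  | { S: string }
  | { N: string } // numbers travel as strings on the wire
  | { BOOL: boolean }
  | { NULL: true }
  | { L: AttributeValue[] }
  | { M: Record<string, AttributeValue> }

const toAttributeValue = (value: unknown): AttributeValue => {
  if (value === null) return { NULL: true }
  if (typeof value === "string") return { S: value }
  if (typeof value === "number") return { N: String(value) }
  if (typeof value === "boolean") return { BOOL: value }
  if (Array.isArray(value)) return { L: value.map(toAttributeValue) }
  // Plain objects become maps of marshalled values.
  return {
    M: Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([k, v]) => [
        k,
        toAttributeValue(v),
      ]),
    ),
  }
}

// toAttributeValue({ userId: "u-1", age: 30 })
// → { M: { userId: { S: "u-1" }, age: { N: "30" } } }
```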

Effect service wrapping AWS SDK DynamoDBClient.

```ts
import { DynamoClient } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `DynamoClient` | `Context.Service` | Effect service class |
| `DynamoClient.layer(config)` | `Layer<DynamoClient>` | Create live layer with region + optional endpoint/credentials |
| `DynamoClient.layerConfig(config)` | `Layer<DynamoClient, ConfigError>` | Create live layer from Effect Config providers. Accepts `{ region, endpoint?, credentials? }` as `Config.Config<...>` values |

| Method | Description |
| --- | --- |
| `createTable(input)` | Create a DynamoDB table |
| `deleteTable(input)` | Delete a DynamoDB table |
| `putItem(input)` | Put a single item |
| `getItem(input)` | Get a single item |
| `deleteItem(input)` | Delete a single item |
| `updateItem(input)` | Update an item with expression |
| `query(input)` | Query a table or index |
| `scan(input)` | Scan a table or index |
| `batchGetItem(input)` | Batch-get up to 100 items |
| `batchWriteItem(input)` | Batch-write up to 25 items |
| `transactGetItems(input)` | Transact-get up to 100 items |
| `transactWriteItems(input)` | Transact-write up to 100 items |
```ts
// Standard layer
DynamoClient.layer({ region: "us-east-1" })

// DynamoDB Local
DynamoClient.layer({
  region: "us-east-1",
  endpoint: "http://localhost:8000",
  credentials: { accessKeyId: "local", secretAccessKey: "local" },
})

// Config-based (reads from Effect Config — e.g., env vars)
DynamoClient.layerConfig({
  region: Config.string("AWS_REGION"),
  endpoint: Config.option(Config.string("DYNAMO_ENDPOINT")),
})
```

Graph-based composite domain model for DynamoDB. Binds a Schema.Class hierarchy to a DAG of underlying entity types sharing a partition key.

```ts
import { Aggregate } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | Overloaded | Create a sub-aggregate or top-level aggregate (see below) |
| `one` | `(name, { entityType }) => OneEdge` | Create a one-to-one edge descriptor |
| `many` | `(name, config) => ManyEdge` | Create a one-to-many edge descriptor |
| `ref` | `(entity) => RefEdge` | Create a ref edge (inline hydration, no separate DynamoDB item) |

Sub-aggregate form: `Aggregate.make(Schema, { root, edges })`

Returns a SubAggregate<TSchema> with a .with(config) method for discriminator binding.

```ts
const TeamSheetAggregate = Aggregate.make(TeamSheet, {
  root: { entityType: "MatchTeam" },
  edges: {
    coach: Aggregate.one("coach", { entityType: "MatchCoach" }),
    players: Aggregate.many("players", { entityType: "MatchPlayer" }),
  },
})
```

Top-level form: `Aggregate.make(Schema, { table, schema, pk, collection, root, refs?, edges })`

Returns an Aggregate<TSchema, TKey> with get, create, update, delete operations.

```ts
const MatchAggregate = Aggregate.make(Match, {
  table: MainTable,
  schema: CricketSchema,
  pk: { field: "pk", composite: ["id"] },
  collection: { index: "lsi1", name: "match", sk: { field: "lsi1sk", composite: ["name"] } },
  root: { entityType: "MatchItem" },
  refs: { Team: Teams, Player: Players, Coach: Coaches, Venue: Venues },
  edges: {
    venue: Aggregate.one("venue", { entityType: "MatchVenue" }),
    team1: TeamSheetAggregate.with({ discriminator: { teamNumber: 1 } }),
    team2: TeamSheetAggregate.with({ discriminator: { teamNumber: 2 } }),
  },
})
```
| Operation | Signature | Description |
| --- | --- | --- |
| `aggregate.get(key)` | `Effect<Domain, AggregateAssemblyError \| DynamoError \| ValidationError>` | Fetch and assemble by partition key |
| `aggregate.create(input)` | `Effect<Domain, AggregateWriteError>` | Create from input (ref IDs hydrated, sub-aggregate transactions) |
| `aggregate.update(key, fn)` | `Effect<Domain, AggregateWriteError>` | Fetch → mutate → diff → write changed groups. `fn` receives an `UpdateContext` with `{ state, cursor, optic, current }` |
| `aggregate.delete(key)` | `Effect<void, AggregateAssemblyError \| DynamoError>` | Remove all items in the partition |
| `aggregate.list(key)` | `Query<Domain>` | Query aggregates by collection index |
| Type | Description |
| --- | --- |
| `OneEdge` | One-to-one edge: `{ _tag: "OneEdge", name, entityType }` |
| `ManyEdge` | One-to-many edge: `{ _tag: "ManyEdge", name, entityType, edgeAttributes?, sk? }` |
| `RefEdge` | Inline ref edge: `{ _tag: "RefEdge", entity }` (no separate DynamoDB item) |
| `AggregateEdge` | Union: `OneEdge \| ManyEdge \| RefEdge` |
| Type | Description |
| --- | --- |
| `SubAggregate<TSchema>` | Composable sub-aggregate with `.with(config)` for discriminator binding |
| `BoundSubAggregate<TSchema>` | Discriminator-bound sub-aggregate, ready to embed in a parent |
| `Aggregate<TSchema, TKey>` | Top-level aggregate with CRUD operations |
| `UpdateContext<TIso, TClass>` | Context provided to the update mutation: `{ state: TIso, cursor: Cursor<TIso>, optic: Optic.Iso<TIso, TIso>, current: TClass }` |
| Type | Description |
| --- | --- |
| `Aggregate.Type<A>` | Assembled domain type (e.g., `Match`) |
| `Aggregate.Key<A>` | Partition key type |
| Export | Description |
| --- | --- |
| `isOneEdge(edge)` | Check if an edge is a `OneEdge` |
| `isManyEdge(edge)` | Check if an edge is a `ManyEdge` |

See Aggregates & Refs for details.
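The model above hinges on every item of an aggregate graph sharing one partition key, so the read path is a single query followed by a group-by on each item's entity type. The following is an illustrative sketch in plain TypeScript, not the library's actual assembly code; `RawItem` and `groupByEntityType` are hypothetical names:

```typescript
// Hypothetical shape of a raw DynamoDB item in an aggregate partition.
interface RawItem {
  readonly pk: string
  readonly entityType: string
  readonly data: Record<string, unknown>
}

// Group a partition's items by entity type: a one() edge expects exactly one
// item of its type, while a many() edge collects all items of its type.
const groupByEntityType = (
  items: ReadonlyArray<RawItem>,
): Map<string, Array<RawItem>> => {
  const groups = new Map<string, Array<RawItem>>()
  for (const item of items) {
    const bucket = groups.get(item.entityType) ?? []
    bucket.push(item)
    groups.set(item.entityType, bucket)
  }
  return groups
}
```

This also shows why `get` can fail with `AggregateAssemblyError`: if a `one()` edge's group is empty or has more than one item, the structural expectation is violated.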


Typed, Effect-native event sourcing on DynamoDB. Provides a Decider model for command-event-state aggregates, an EventStream repository per stream type, optimistic concurrency via stream versioning, and a commandHandler combinator for the read-decide-append cycle.

```ts
import { EventStore } from "effect-dynamodb"
```

| Export | Description |
| --- | --- |
| `EventStore.makeStream({ table, streamName, events, streamId, metadata? })` | Create an `EventStream<TEvent, TStreamIdFields, TMetadata>` bound to a `Table`. `events` is an array of `Schema.Class` event types, `streamId.composite` lists the stream-id composite fields, and the optional `metadata` is a `Schema`. |
| `EventStore.bind(stream)` | Resolve `DynamoClient` and `TableConfig` from context and return a `BoundEventStream` whose operations have `R = never`. Use inside `Context.Service` `make` effects. |

A `Decider<State, Command, Event, E = never>` encodes one aggregate's command-event-state triad:

| Field | Type |
| --- | --- |
| `decide` | `(command: Command, state: State) => Effect<ReadonlyArray<Event>, E>` |
| `evolve` | `(state: State, event: Event) => State` (pure) |
| `initialState` | `State` |

`EventStream` and `BoundEventStream` expose the same operations. The only difference is the `R` parameter: `EventStream` operations require `DynamoClient | TableConfig`, while `BoundEventStream` operations have `R = never`.

| Method | Returns |
| --- | --- |
| `append(streamId, events, expectedVersion, options?)` | `Effect<AppendResult<TEvent>, VersionConflict \| DynamoClientError \| ValidationError \| TransactionCancelled, R>`. Atomically appends events with optimistic concurrency: each event becomes a `Put` with `attribute_not_exists(pk)` inside a single `TransactWriteItems`. The optional `options.metadata` is validated against the stream's metadata schema. |
| `read(streamId)` | `Effect<ReadonlyArray<StreamEvent<TEvent>>, DynamoClientError \| ValidationError, R>`. Reads all events for a stream in version order. |
| `readFrom(streamId, afterVersion)` | `Effect<ReadonlyArray<StreamEvent<TEvent>>, DynamoClientError \| ValidationError, R>`. Reads events strictly after the given version. |
| `currentVersion(streamId)` | `Effect<number, DynamoClientError \| ValidationError, R>`. Current head version (0 if the stream is empty). |
| `query.events(streamId)` | `Query<StreamEvent<TEvent>>`. Raw query handle for use with `Query` combinators. |

`BoundEventStream` additionally exposes `provide(effect)` as an escape hatch for supplying its services to an arbitrary effect.
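The optimistic-concurrency contract behind `append` can be sketched with an in-memory stand-in (plain TypeScript, illustrative only; `InMemoryStream` is a hypothetical name, and the real implementation uses a conditional `TransactWriteItems`, not an array): a write succeeds only when the caller's `expectedVersion` matches the current head, otherwise it fails the way the store raises `VersionConflict`.

```typescript
interface Stored<E> {
  readonly version: number
  readonly data: E
}

// In-memory model of one stream's append/version semantics.
class InMemoryStream<E> {
  private events: Array<Stored<E>> = []

  // Head version; 0 when the stream is empty.
  currentVersion(): number {
    return this.events.length
  }

  // Append succeeds only if expectedVersion equals the current head,
  // mirroring the VersionConflict failure of the real event store.
  append(events: ReadonlyArray<E>, expectedVersion: number): { version: number } {
    if (expectedVersion !== this.events.length) {
      throw new Error("VersionConflict")
    }
    for (const data of events) {
      this.events.push({ version: this.events.length + 1, data })
    }
    return { version: this.events.length }
  }
}
```

A stale writer (one that read version 0 before another writer appended) therefore fails fast instead of silently interleaving events.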

| Export | Description |
| --- | --- |
| `EventStore.commandHandler(decider, stream)` | Build a `(streamId, command, options?) => Effect<CommandHandlerResult, …, R>` that reads, folds, decides, and appends atomically. Dual API: `commandHandler(decider, stream)` (data-first) or `stream.pipe(EventStore.commandHandler(decider))` (data-last). Works with both `EventStream` (`R = DynamoClient \| TableConfig`) and `BoundEventStream` (`R = never`). On a no-op command (`decide` returns `[]`) the handler returns the current state and version without writing. |
| Export | Description |
| --- | --- |
| `EventStore.fold(decider, events)` | Pure: fold a list of `StreamEvent<TEvent>` through `decider.evolve`, starting from `decider.initialState`. Dual API. |
| `EventStore.foldFrom(decider, startState, events)` | Pure: fold from a supplied starting state (e.g. snapshot + delta events). Dual API. |
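Semantically, these combinators are ordinary left folds. A minimal pure sketch (illustrative only; `PureDecider`, `fold`, and `foldFrom` here are standalone names, and the real library versions operate on `StreamEvent`s and support a dual API):

```typescript
// Just the pure half of a decider: a seed state and a transition function.
interface PureDecider<S, E> {
  readonly initialState: S
  readonly evolve: (state: S, event: E) => S
}

// Replay events from an arbitrary starting state (e.g. a snapshot).
const foldFrom = <S, E>(
  d: PureDecider<S, E>,
  start: S,
  events: ReadonlyArray<E>,
): S => events.reduce((state, event) => d.evolve(state, event), start)

// Replay the full history from the decider's initial state.
const fold = <S, E>(d: PureDecider<S, E>, events: ReadonlyArray<E>): S =>
  foldFrom(d, d.initialState, events)
```

Because `evolve` is pure, current state is always derivable by replay, which is what lets `commandHandler` rebuild state on every command.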
| Type | Shape |
| --- | --- |
| `StreamEvent<A>` | `{ streamId, version, eventType, data: A, metadata, timestamp }` |
| `AppendResult<A>` | `{ version: number, events: ReadonlyArray<A> }` |
| `CommandHandlerResult<State, Event>` | `AppendResult<Event> & { state: State }` |
| `EventStream<TEvent, TStreamIdFields, TMetadata>` | Repository interface returned by `makeStream` |
| `BoundEventStream<TEvent, TStreamIdFields, TMetadata>` | Same shape as `EventStream`, with `R = never` on every operation |
```ts
import { EventStore } from "effect-dynamodb"
import { Effect, Schema, Context } from "effect"

class MatchStarted extends Schema.Class<MatchStarted>("MatchStarted")({
  venue: Schema.String,
}) {}

class InningsCompleted extends Schema.Class<InningsCompleted>("InningsCompleted")({
  innings: Schema.Number,
  runs: Schema.Number,
}) {}

const MatchEvents = EventStore.makeStream({
  table: EventsTable,
  streamName: "Match",
  events: [MatchStarted, InningsCompleted],
  streamId: { composite: ["matchId"] },
})

const decider = {
  initialState: { innings: 0, totalRuns: 0 },
  decide: (cmd: { _tag: "Start"; venue: string }, _state) =>
    Effect.succeed([new MatchStarted({ venue: cmd.venue })]),
  evolve: (state, event) => {
    // Schema.Class instances are plain classes, so narrow with instanceof
    if (event instanceof MatchStarted) return state
    return { innings: event.innings, totalRuns: state.totalRuns + event.runs }
  },
}

class MatchEventService extends Context.Service<MatchEventService>()(
  "@app/MatchEventService",
  {
    make: Effect.gen(function* () {
      const stream = yield* EventStore.bind(MatchEvents)
      const handle = EventStore.commandHandler(decider, stream)
      return {
        start: (matchId: string, venue: string) =>
          handle({ matchId }, { _tag: "Start", venue }),
        history: (matchId: string) => stream.read({ matchId }),
      }
    }),
  },
) {}
```

See Event Sourcing for a complete tutorial.


Tagged error types for precise error handling with catchTag.

```ts
import {
  DynamoError, ThrottlingError, DynamoValidationError,
  InternalServerError, ResourceNotFoundError,
  ItemNotFound, ConditionalCheckFailed,
  ValidationError, TransactionCancelled, TransactionOverflow,
  UniqueConstraintViolation, OptimisticLockError,
  ItemDeleted, ItemNotDeleted,
  RefNotFound, AggregateAssemblyError,
  AggregateDecompositionError, AggregateTransactionOverflow,
  CascadePartialFailure, VersionConflict,
} from "effect-dynamodb"
```
| Error | Tag | Description |
| --- | --- | --- |
| `DynamoError` | `"DynamoError"` | AWS SDK error wrapper (includes `operation` and `cause`) |
| `ThrottlingError` | `"ThrottlingError"` | AWS SDK throttling: request rate exceeded |
| `DynamoValidationError` | `"DynamoValidationError"` | Malformed DynamoDB request |
| `InternalServerError` | `"InternalServerError"` | Transient DynamoDB failure |
| `ResourceNotFoundError` | `"ResourceNotFoundError"` | Table or index does not exist |
| `ItemNotFound` | `"ItemNotFound"` | `getItem` returned no item |
| `ConditionalCheckFailed` | `"ConditionalCheckFailed"` | Condition expression not met (`put`, `update`, `delete`, `create`) |
| `ValidationError` | `"ValidationError"` | Schema decode/encode failed |
| `TransactionCancelled` | `"TransactionCancelled"` | Transaction rejected (includes cancellation reasons) |
| `TransactionOverflow` | `"TransactionOverflow"` | Transaction exceeds DynamoDB's 100-item limit |
| `UniqueConstraintViolation` | `"UniqueConstraintViolation"` | Unique constraint violated on `put`/`create` |
| `OptimisticLockError` | `"OptimisticLockError"` | Version mismatch on `expectedVersion()` |
| `ItemDeleted` | `"ItemDeleted"` | Item is soft-deleted (`get` returns this instead of the item) |
| `ItemNotDeleted` | `"ItemNotDeleted"` | Restore called on an item that isn't soft-deleted |
| `RefNotFound` | `"RefNotFound"` | Referenced entity not found during ref hydration (`entity`, `field`, `refEntity`, `refId`) |
| `AggregateAssemblyError` | `"AggregateAssemblyError"` | Aggregate read path failed: missing items, structural violations, or decode errors (`aggregate`, `reason`, `key`) |
| `AggregateDecompositionError` | `"AggregateDecompositionError"` | Aggregate write path failed: schema validation or structural error (`aggregate`, `member`, `reason`) |
| `AggregateTransactionOverflow` | `"AggregateTransactionOverflow"` | Sub-aggregate exceeds the 100-item transaction limit (`aggregate`, `subgraph`, `itemCount`, `limit`) |
| `CascadePartialFailure` | `"CascadePartialFailure"` | Cascade update partially failed in eventual mode (`sourceEntity`, `sourceId`, `succeeded`, `failed`, `errors`) |
| `VersionConflict` | `"VersionConflict"` | Event store version mismatch |
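Because every error carries a literal `_tag`, handlers can also narrow exhaustively outside of `catchTag`, for example to classify which failures are worth retrying. A sketch in plain TypeScript (the three-member union and `isRetryable` are illustrative, not library exports):

```typescript
// Hypothetical subset of the tagged-error union, modeled structurally.
type DbError =
  | { readonly _tag: "ItemNotFound" }
  | { readonly _tag: "ThrottlingError" }
  | { readonly _tag: "VersionConflict" }

// Exhaustive switch on _tag: if a new tag is added to DbError,
// the compiler flags the missing case.
const isRetryable = (e: DbError): boolean => {
  switch (e._tag) {
    case "ThrottlingError": // transient: back off and retry
    case "VersionConflict": // resolvable: re-read the stream and retry
      return true
    case "ItemNotFound": // definitive: retrying cannot help
      return false
  }
}
```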
```ts
import { Effect } from "effect"
import { DynamoClient } from "effect-dynamodb"

// UserEntity and MainTable are assumed to be defined elsewhere in the app
const program = Effect.gen(function* () {
  const db = yield* DynamoClient.make({
    entities: { UserEntity },
    tables: { MainTable },
  })
  const users = db.entities.UserEntity
  const user = yield* users.get({ userId: "u-1" }).pipe(
    Effect.catchTag("ItemNotFound", () => Effect.succeed(null)),
    Effect.catchTag("DynamoError", (e) =>
      Effect.die(`DynamoDB ${e.operation} failed: ${e.cause}`),
    ),
  )
  return user
})
```