Module-by-module reference for all public exports of effect-dynamodb.
Provides annotations, date schemas, storage modifiers, and model configuration for effect-dynamodb.
```ts
import { DynamoModel } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `Hidden` | `<S>(schema: S) => S` | Marks a field as hidden from asModel/asRecord decode. Still stored in DynamoDB, visible in asItem/asNative |
| `isHidden` | `(schema: Schema.Top) => boolean` | Check if a schema field has the Hidden annotation |
| `identifier` | `<S>(schema: S) => S` | Marks the primary business identifier on a model. Exactly one per entity. Required for entities referenced via ref |
| `isIdentifier` | `(schema: Schema.Top) => boolean` | Check if a schema field has the identifier annotation |
| `getIdentifierField` | `(model: Schema.Top) => { name, schema } \| undefined` | Find the identifier field in a model's `.fields` |
| `ref` | `<S>(schema: S) => S` | Marks a field as a denormalized reference to another entity. Transforms types in Entity (Input: ID, Record: full object, Update: optional ID) |
| `isRef` | `(schema: Schema.Top) => boolean` | Check if a schema field has the ref annotation |
| `getRefAnnotation` | `(schema: Schema.Top) => RefAnnotation \| undefined` | Get the full ref annotation metadata |
All date schemas carry a DynamoEncoding annotation that controls how the field is stored in DynamoDB.
| Export | Wire Type | Domain Type | DynamoDB Storage |
| --- | --- | --- | --- |
| `DateString` | ISO 8601 string | DateTime.Utc | String (S) |
| `DateEpochMs` | epoch milliseconds | DateTime.Utc | Number (N) |
| `DateEpochSeconds` | epoch seconds | DateTime.Utc | Number (N) |
| `DateEpoch(opts)` | auto-detect ms/seconds | DateTime.Utc | configurable |
| `DateTimeZoned` | ISO+offset+zone string | DateTime.Zoned | String (S) |
| `UnsafeDateString` | ISO 8601 string | native Date | String (S) |
| `UnsafeDateEpochMs` | epoch milliseconds | native Date | Number (N) |
| `UnsafeDateEpochSeconds` | epoch seconds | native Date | Number (N) |
| `TTL` | epoch seconds | DateTime.Utc | Number (N) — alias for `DateEpochSeconds` |
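The auto-detection performed by `DateEpoch` is not spelled out above. The sketch below shows one plausible heuristic (an illustration, not the library's actual logic): contemporary epoch-second values sit around 1.7e9 while epoch-millisecond values sit around 1.7e12, so a 1e12 threshold cleanly separates the two.

```typescript
// Illustrative heuristic only — not effect-dynamodb's implementation.
// Epoch seconds for contemporary dates are ~1.7e9; epoch milliseconds
// are ~1.7e12, so values at or above 1e12 are treated as milliseconds.
function epochToMs(value: number): number {
  const MS_THRESHOLD = 1e12
  return value >= MS_THRESHOLD ? value : value * 1000
}

console.log(epochToMs(1700000000))    // seconds input → milliseconds
console.log(epochToMs(1700000000000)) // already milliseconds → unchanged
```

This trade-off only breaks down for second-precision dates after the year 33658, which is why threshold-based auto-detection is a common pattern.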
| Export | Type | Description |
| --- | --- | --- |
| `storedAs` | `<A>(storageSchema: Schema<A>) => (fieldSchema: Schema<A>) => Schema<A>` | Override DynamoDB storage format. Type-safe: domain types must match |
| Export | Type | Description |
| --- | --- | --- |
| `configure` | `(model, attributes) => ConfiguredModel<M>` | Wrap a model with per-field DynamoDB overrides (field renaming, storage encoding, immutable) |
| `isConfiguredModel` | `(value) => boolean` | Check if a value is a ConfiguredModel |
| Export | Type | Description |
| --- | --- | --- |
| `DynamoEncodingKey` | `symbol` | Annotation key for reading DynamoEncoding from schema AST |
| `getEncoding` | `(schema) => DynamoEncoding \| undefined` | Read the DynamoEncoding annotation from a schema |
| Type | Description |
| --- | --- |
| `DynamoEncoding` | `{ storage: "string" \| "epochMs" \| "epochSeconds", domain: "DateTime.Utc" \| "DateTime.Zoned" \| "Date" }` |
| `ConfiguredModel<M>` | Wrapper carrying original model + per-field attribute overrides |
| `RefAnnotation` | `{ _tag: "Ref", refSchemaId?: string }` — annotation metadata for ref fields |
```ts
class Employee extends Schema.Class<Employee>("Employee")({
  employeeId: Schema.String,
  createdBy: Schema.String,
  internalId: Schema.String.pipe(DynamoModel.Hidden),
}) {}

// configure — immutable fields + field overrides
const EmployeeModel = DynamoModel.configure(Employee, {
  createdBy: { immutable: true },
})

class Event extends Schema.Class<Event>("Event")({
  startedAt: DynamoModel.DateString,        // ISO string ↔ DateTime.Utc
  expiresAt: DynamoModel.DateEpochSeconds,  // epoch seconds (for TTL)
  scheduledAt: DynamoModel.DateTimeZoned,   // with timezone
}) {}

// storedAs — override storage format
class Order extends Schema.Class<Order>("Order")({
  // Wire: ISO string, stored in DynamoDB as epoch seconds (for TTL)
  expiresAt: DynamoModel.DateString.pipe(DynamoModel.storedAs(DynamoModel.DateEpochSeconds)),
}) {}

// configure — field renaming + storage overrides
const OrderModel = DynamoModel.configure(Order, {
  expiresAt: { field: "ttl", storedAs: DynamoModel.DateEpochSeconds },
})

// identifier — marks primary business ID for ref resolution
class Team extends Schema.Class<Team>("Team")({
  id: Schema.String.pipe(DynamoModel.identifier),
}) {}

// ref — denormalized reference to another entity
class Selection extends Schema.Class<Selection>("Selection")({
  team: Team.pipe(DynamoModel.ref),      // Input: teamId, Record: Team
  player: Player.pipe(DynamoModel.ref),  // Input: playerId, Record: Player
}) {}
```
Application namespace for key prefixing and versioning.
```ts
import { DynamoSchema } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(config) => DynamoSchema` | Create a schema with name, version, and optional casing |
| `prefix` | `(schema) => string` | Build schema prefix: `$name#vN` |
| `applyCasing` | `(schema, value) => string` | Apply casing rules to a structural key part |
| `composeKey` | `(schema, entityType, composites) => string` | Compose entity key: `$schema#vN#entity_type#composites` |
| `composeCollectionKey` | `(schema, collection, composites) => string` | Compose collection PK |
| `composeClusteredSortKey` | `(schema, collection, entity, composites) => string` | Compose clustered SK |
| `composeIsolatedSortKey` | `(schema, entity, composites) => string` | Compose isolated SK |
| `composeUniqueKey` | `(schema, entity, constraint, values) => { pk, sk }` | Compose unique constraint sentinel keys |
| `composeVersionKey` | `(schema, entity, version) => string` | Compose version snapshot SK |
| `composeDeletedKey` | `(schema, entity, timestamp) => string` | Compose soft-deleted item SK |
| `composeVersionKeyPrefix` | `(schema, entity) => string` | Prefix for version queries |
| `composeDeletedKeyPrefix` | `(schema, entity) => string` | Prefix for deleted queries |

| Type | Description |
| --- | --- |
| `DynamoSchema` | Interface: `{ name: string, version: number, casing: Casing }` |
| `Casing` | `"lowercase" \| "uppercase" \| "preserve"` |
```ts
const AppSchema = DynamoSchema.make({ name: "myapp", version: 1 })
// Keys generated as: $myapp#v1#entity_type#composites
```
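The documented key format can be illustrated with a small standalone re-implementation. Everything below is an assumption for illustration (the `DynamoSchemaLike` shape and the choice to apply casing only to the entity type), not the library's code:

```typescript
// Illustrative sketch of the documented $name#vN#entity_type#composites
// format — not effect-dynamodb's implementation.
interface DynamoSchemaLike {
  name: string
  version: number
  casing: "lowercase" | "uppercase" | "preserve"
}

const applyCasing = (schema: DynamoSchemaLike, part: string): string =>
  schema.casing === "lowercase" ? part.toLowerCase()
    : schema.casing === "uppercase" ? part.toUpperCase()
    : part

// Schema prefix: $name#vN
const prefix = (schema: DynamoSchemaLike): string =>
  `$${schema.name}#v${schema.version}`

// Assumption: casing applies to the structural entity-type part;
// composite values are joined as-is with "#".
const composeKey = (
  schema: DynamoSchemaLike,
  entityType: string,
  composites: ReadonlyArray<string>
): string =>
  [prefix(schema), applyCasing(schema, entityType), ...composites].join("#")

const AppSchema: DynamoSchemaLike = { name: "myapp", version: 1, casing: "lowercase" }
console.log(composeKey(AppSchema, "User", ["u-1"])) // "$myapp#v1#user#u-1"
```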
See Modeling for details.
Minimal table definition with Layer-based name injection.
```ts
import { Table } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(config: { schema, entities?, aggregates? }) => Table` | Create a table definition with entities and aggregates |
| `definition` | `(table) => TableDefinition` | Derive CreateTableCommandInput fields from entity index declarations |

| Type | Description |
| --- | --- |
| `Table` | Interface with schema, entities, aggregates, `layer()`, `layerConfig()`, and Tag for DI |
| `TableConfig` | `{ name: string }` — runtime table configuration |
| `TableDefinition` | `{ KeySchema, AttributeDefinitions, GlobalSecondaryIndexes? }` |
```ts
const MainTable = Table.make({ schema: AppSchema, entities: { UserEntity, TaskEntity } })

// Runtime name injection via Layer
MainTable.layer({ name: "MyTable" })

// Typed client auto-derives table schema for creation
const db = yield* DynamoClient.make({
  entities: { UserEntity, TaskEntity },
  tables: { MainTable },
})
yield* db.tables.MainTable.create()
```
See Modeling for details.
The core module — binds models to tables, provides CRUD operations, query accessors, lifecycle management, and 7 derived types. This is the largest module.
```ts
import { Entity } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(config) => Entity` | Create an entity with model, table, indexes, and optional system fields |
make config:
| Property | Type | Required | Description |
| --- | --- | --- | --- |
| `model` | `Schema.Class \| Schema.Struct \| ConfiguredModel` | Yes | Domain model schema (or configured model with field overrides) |
| `entityType` | `string` | Yes | Discriminator stored as `__edd_e__` |
| `primaryKey` | `{ pk: KeyDef, sk: KeyDef }` | Yes | Primary key composition rules |
| `indexes` | `Record<string, GsiIndexDef>` | No | GSI index definitions with name, pk, sk, optional collection |
| `timestamps` | `boolean \| { created?, updated? }` | No | Auto-managed createdAt/updatedAt |
| `versioned` | `boolean \| { retain?, field?, ttl? }` | No | Auto-increment version with optional snapshot retention |
| `softDelete` | `boolean \| { ttl?, preserveUnique? }` | No | Soft delete with optional TTL |
| `unique` | `Record<string, string[]>` | No | Unique constraint definitions |
| `refs` | `Record<string, Entity>` | No | Map ref field names to their source entities for ref hydration |
These are methods on the entity definition returned by Entity.make(). They return operation descriptors used by Transaction, Batch, and advanced pipeable workflows. For day-to-day CRUD, use DynamoClient.make(table) to get a typed client with executable methods (see below).
| Operation | Signature | Description |
| --- | --- | --- |
| `entity.put(input)` | `EntityPut` | Insert or overwrite an item (descriptor) |
| `entity.create(input)` | `EntityPut` | Insert only — fails with ConditionalCheckFailed if exists (descriptor) |
| `entity.upsert(input)` | `EntityPut` | Create or update — uses `if_not_exists()` for immutable fields, createdAt, version (descriptor) |
| `entity.get(key)` | `EntityGet` | Get a single item by primary key (descriptor) |
| `entity.update(key)` | `EntityUpdate` | Start an update operation (compose with set, add, etc.) (descriptor) |
| `entity.patch(key)` | `EntityUpdate` | Update with `attribute_exists` — fails with ConditionalCheckFailed if not exists (descriptor) |
| `entity.delete(key)` | `EntityDelete` | Delete an item (descriptor) |
| `entity.deleteIfExists(key)` | `EntityDelete` | Delete with `attribute_exists` — fails with ConditionalCheckFailed if not exists (descriptor) |
| `entity.query.<indexName>(pk)` | `Query<Record>` | Query a specific index by partition key (descriptor) |
| `entity.scan()` | `Query<Record>` | Full table scan filtered by entity type (descriptor) |
| `entity.versions(key)` | `Query<Record>` | Query version history for an item (requires `versioned: { retain: true }`) (descriptor) |
| `entity.deleted.list(key)` | `Query<Record>` | List soft-deleted items for a key (requires softDelete) (descriptor) |
| `entity.batchGet(keys)` | See Batch module | Batch get items |
| `entity.batchPut(items)` | See Batch module | Batch put items |
| `entity.batchDelete(keys)` | See Batch module | Batch delete items |
A note for Effect v3 users
entity.get, entity.put, entity.update, and entity.delete return Yieldable operation descriptors (EntityOp / EntityDelete), not Effects directly. This mirrors the Effect v4 convention where types like Option are also Yieldable-only — yield* op works inside Effect.gen, but op.pipe(Effect.map(...)) will not compile.
```ts
// ✅ Works — yieldable in Effect.gen
yield* UserEntity.get({ userId: "u-1" })

// ❌ Does not compile — descriptor is not an Effect
UserEntity.get({ userId: "u-1" }).pipe(Effect.map(/* … */))

// ✅ Convert to Effect explicitly when needed
UserEntity.get({ userId: "u-1" }).asEffect().pipe(Effect.map(/* … */))
```
The recommended path is to access entities through DynamoClient.make() — the resulting BoundEntity (db.entities.UserEntity) auto-wraps each operation into a real Effect<A, E, never>, so you get straightforward Effect.map/Effect.flatMap/pipe ergonomics without ever calling .asEffect(). Reach for raw entity descriptors only when (a) you need an alternate decode mode (asRecord/asItem/asNative) or (b) you are composing the descriptor into a Transaction or Batch, both of which consume descriptors directly.
DynamoClient.make({ entities, aggregates?, tables? }) resolves dependencies and returns a typed client with executable operations for all listed entities and aggregates, namespaced under entities, aggregates, collections, and tables:
```ts
const MainTable = Table.make({ schema: AppSchema, entities: { UserEntity } })

const db = yield* DynamoClient.make({
  entities: { UserEntity },
})
const users = db.entities.UserEntity
```
Bound entity methods:
| Method | Description |
| --- | --- |
| `bound.get(key)` | Get a single item |
| `bound.put(input, ...combinators)` | Insert or overwrite. Optional combinators (e.g. `condition(...)`) |
| `bound.create(input, ...combinators)` | Insert only — fails with ConditionalCheckFailed if exists |
| `bound.upsert(input, ...combinators)` | Create or update — `if_not_exists()` for immutable fields, createdAt, version |
| `bound.update(key, ...combinators)` | Update an item — compose with `Entity.set(...)`, `Entity.expectedVersion(...)`, etc. |
| `bound.patch(key, ...combinators)` | Update with `attribute_exists` — fails if item does not exist |
| `bound.delete(key, ...combinators)` | Delete an item. Optional combinators (e.g. `condition(...)`) |
| `bound.deleteIfExists(key, ...combinators)` | Delete with `attribute_exists` — fails if item does not exist |
| `bound.paginate(query, ...combinators)` | Execute a query and return a lazy `Stream<A>` of items, automatically paginating |
| `bound.collect(query, ...combinators)` | Execute a query and collect all pages into `Effect<Array<A>>` |
Bound lifecycle methods (require matching entity config):
| Method | Requires | Returns | Description |
| --- | --- | --- | --- |
| `bound.getVersion(key, version)` | `versioned: { retain: true }` | Effect | Get a specific version snapshot |
| `bound.versions(key)` | `versioned: { retain: true }` | BoundQuery | List all version snapshots as a fluent BoundQuery — supports `.collect()`, `.fetch()`, `.paginate()`, `.count()`, `.limit()`, `.reverse()`, `.startFrom()`, `.filter()`, `.select()` |
| `bound.restore(key)` | `softDelete` | Effect | Restore a soft-deleted item |
| `bound.purge(key)` | Any | Effect | Permanently remove item + all versions and sentinels |
| `bound.deleted.get(key)` | `softDelete` | Effect | Get a specific soft-deleted item |
| `bound.deleted.list(key)` | `softDelete` | BoundQuery | List all soft-deleted tombstones in the partition as a fluent BoundQuery |
Entity definition lifecycle operations (require matching config — listed in Operations table above for query descriptors):
| Operation | Requires | Description |
| --- | --- | --- |
| `entity.getVersion(key, version)` | `versioned: { retain: true }` | Get a specific version snapshot (descriptor) |
| `entity.deleted.get(key)` | `softDelete` | Get a soft-deleted item (descriptor) |
| `entity.restore(key)` | `softDelete` | Restore a soft-deleted item (descriptor) |
| `entity.purge(key)` | Any | Delete all items in the partition (main + versions + deleted) (descriptor) |
These functions transform entity operations via pipe:
| Export | Works On | Description |
| --- | --- | --- |
| `set(updates)` | EntityUpdate | Set fields to new values (dual API) |
| `expectedVersion(n)` | EntityUpdate | Optimistic lock — fail if version doesn't match |
| `consistentRead` | EntityGet | Enable strongly consistent reads |
| `condition(input)` | EntityPut, EntityUpdate, EntityDelete | Add a ConditionExpression |
| `remove(fields)` | EntityUpdate | REMOVE attributes from the item |
| `add(values)` | EntityUpdate | Atomically ADD to numeric attributes |
| `subtract(values)` | EntityUpdate | Subtract from numeric attributes (`SET #f = #f - :v`) |
| `append(values)` | EntityUpdate | Append to list attributes (`list_append`) |
| `deleteFromSet(values)` | EntityUpdate | DELETE elements from Set attributes |
| `returnValues(mode)` | EntityUpdate, EntityDelete | Control DynamoDB ReturnValues: `"none" \| "allOld" \| "allNew" \| "updatedOld" \| "updatedNew"` |
| `cascade(config)` | EntityUpdate | Propagate source entity changes to target entities that embed it via DynamoModel.ref. Config: `{ targets, filter?, mode? }` (dual API) |
Control what type an operation returns:
| Export | Returns | Description |
| --- | --- | --- |
| `asModel` | `Entity.Model<E>` | Pure domain object (default for `yield*`) |
| `asRecord` | `Entity.Record<E>` | Domain + system fields (version, timestamps) |
| `asItem` | `Entity.Item<E>` | Full DynamoDB item (all keys + `__edd_e__`) |
| `asNative` | `Entity.Marshalled<E>` | Raw DynamoDB AttributeValue format |
| Type | Description |
| --- | --- |
| `Entity.Model<E>` | Pure domain object fields |
| `Entity.Record<E>` | Model + system metadata (version, timestamps) |
| `Entity.Input<E>` | Creation input (model fields, no system fields) |
| `Entity.Update<E>` | Mutable fields only (keys and immutable excluded) |
| `Entity.Key<E>` | Primary key attributes only |
| `Entity.Item<E>` | Full DynamoDB item (model + system + keys + `__edd_e__`) |
| `Entity.Marshalled<E>` | DynamoDB AttributeValue format |
| Export | Description |
| --- | --- |
| `keyAttributes(entity)` | List all key attribute names (primary + GSI) |
| `keyFieldNames(entity)` | List physical field names for all keys |
| `compositeAttributes(entity)` | List all composite attribute names across indexes |
| `itemSchema(entity)` | Get the item-level decode schema |
| `decodeMarshalledItem(entity, item)` | Decode a marshalled DynamoDB item through entity schema |
```ts
const UserEntity = Entity.make({
  // model, entityType, table, … (elided — see the make config table above)
  primaryKey: {
    pk: { field: "pk", composite: ["userId"] },
    sk: { field: "sk", composite: [] },
  },
  indexes: {
    byEmail: {
      pk: { field: "gsi1pk", composite: ["email"] },
      sk: { field: "gsi1sk", composite: [] },
    },
  },
  unique: { email: ["email"] },
})

// Get typed client with executable operations
const db = yield* DynamoClient.make({
  entities: { UserEntity },
})
const users = db.entities.UserEntity

const user = yield* users.put({ userId: "u-1", email: "a@b.com", ... })
const found = yield* users.get({ userId: "u-1" })

// Update with combinators
yield* users.update({ userId: "u-1" }, Entity.set({ email: "new@b.com" }))

// Update with multiple combinators
yield* users.update(
  { userId: "u-1" },
  Entity.set({ email: "new@b.com" }),
  Entity.expectedVersion(1),
)

yield* users.delete({ userId: "u-1" })

// Query execution via bound entity
const allUsers = yield* users.collect(UserEntity.query.byEmail({ email: "a@b.com" }))
const stream = users.paginate(UserEntity.scan(), Query.limit(100))

// Or with v2 entity-centric pattern (BoundQuery fluent API):
// const db = yield* DynamoClient.make({ entities: { UserEntity } })
// const allUsers = yield* db.entities.UserEntity.byEmail({ email: "a@b.com" }).collect()
```
See Getting Started and Modeling for details.
Pipeable Query<A> data type — a lazy, immutable description of a DynamoDB query or scan.
```ts
import { Query } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `where(conditions)` | Dual | Add sort key conditions (KeyConditionExpression) |
| `limit(n)` | Dual | Set max items per page |
| `maxPages(n)` | Dual | Limit total number of pages fetched |
| `reverse` | Combinator | Reverse sort order (descending) |
| `consistentRead` | Combinator | Enable strongly consistent reads |
| `ignoreOwnership` | Combinator | Skip `__edd_e__` entity type filter — for mixed-table scenarios |
| `startFrom(cursor)` | Dual | Resume pagination from a previous cursor |
| `select(attrs)` | Dual | Project specific attributes (returns `Query<Record<string, unknown>>`) |
| `filterExpr(expr)` | Dual | Add an Expr ADT filter (from callback API) |
| `selectPaths(paths)` | Dual | Project path segments (from callback API) |

| Export | Returns | Description |
| --- | --- | --- |
| `execute` | `Effect<Page<A>>` | Execute and return a page: `{ items: Array<A>, cursor: string \| null }` |
| `collect` | `Effect<Array<A>>` | Execute, fetch all pages, and flatten into a single array |
| `paginate` | `Effect<Stream<Array<A>>>` | Execute and return a Stream of pages for lazy pagination |
| `count` | `Effect<number>` | Execute with SELECT COUNT — returns total matching items (respects maxPages) |
| `asParams` | `Effect<Record<string, unknown>>` | Return built DynamoDB command input without executing — useful for debugging |
Used in entity-level filter():
| Operator | Example | Description |
| --- | --- | --- |
| (equality) | `{ status: "active" }` | Exact match |
| `ne` | `{ status: { ne: "deleted" } }` | Not equal |
| `gt` | `{ price: { gt: 30 } }` | Greater than |
| `gte` | `{ price: { gte: 30 } }` | Greater than or equal |
| `lt` | `{ price: { lt: 100 } }` | Less than |
| `lte` | `{ price: { lte: 100 } }` | Less than or equal |
| `between` | `{ price: { between: [10, 50] } }` | Inclusive range |
| `beginsWith` | `{ name: { beginsWith: "A" } }` | String prefix |
| `contains` | `{ name: { contains: "widget" } }` | Substring match |
| `exists` | `{ email: { exists: true } }` | Attribute exists |
| `notExists` | `{ email: { notExists: true } }` | Attribute does not exist |
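As a rough mental model, these shorthand operators compile down to DynamoDB FilterExpression clauses with name and value placeholders. The sketch below is an illustrative standalone compiler for a handful of operators, not effect-dynamodb's implementation (its placeholder naming may differ):

```typescript
// Illustrative shorthand → FilterExpression compiler — not the library's code.
type Shorthand = Record<string, unknown>

function compileFilter(input: Shorthand): {
  expression: string
  names: Record<string, string>
  values: Record<string, unknown>
} {
  const names: Record<string, string> = {}
  const values: Record<string, unknown> = {}
  const clauses: string[] = []
  let i = 0

  for (const [field, cond] of Object.entries(input)) {
    const n = `#${field}`
    names[n] = field
    const emit = (op: string, v: unknown) => {
      const val = `:v${i++}`
      values[val] = v
      clauses.push(`${n} ${op} ${val}`)
    }
    if (cond !== null && typeof cond === "object" && !Array.isArray(cond)) {
      const [op, v] = Object.entries(cond as Record<string, unknown>)[0]
      if (op === "gt") emit(">", v)
      else if (op === "gte") emit(">=", v)
      else if (op === "lt") emit("<", v)
      else if (op === "lte") emit("<=", v)
      else if (op === "ne") emit("<>", v)
      else throw new Error(`unsupported operator in this sketch: ${op}`)
    } else {
      emit("=", cond) // bare value → equality
    }
  }
  return { expression: clauses.join(" AND "), names, values }
}

const f = compileFilter({ status: "active", price: { gt: 30 } })
console.log(f.expression) // "#status = :v0 AND #price > :v1"
```

The `#field`/`:vN` indirection is required by DynamoDB itself, which is why every expression builder in this document returns an `expression` string alongside `names` and `values` maps.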
| Export | Description |
| --- | --- |
| `isQuery(value)` | Type guard for `Query<A>` |
```ts
const db = yield* DynamoClient.make({
  entities: { TaskEntity },
})
const tasks = db.entities.TaskEntity

const items = yield* tasks.collect(
  TaskEntity.query.byProject({ projectId: "p-1" }),
  Query.where({ beginsWith: activePrefix }),
  TaskEntity.filter({ priority: { gt: 3 } }),
)
```
See Queries for details.
The fluent query builder returned by every entity query accessor on a BoundEntity (e.g. db.entities.Tasks.byProject({...})) and by db.entities.Tasks.scan(). BoundQuery<Model, SkRemaining, A> wraps an internal Query<A> with pre-resolved services so all terminals return Effect<..., ..., never>.
```ts
import type { BoundQuery } from "effect-dynamodb"
```
Combinators are immutable — each call returns a new BoundQuery. Terminals execute the query.
| Method | Description |
| --- | --- |
| `.where((t, ops) => …)` | Type-safe sort key condition on remaining SK composites. Only available when SK composites have not all been consumed. Consumes SkRemaining (cannot be called twice). Operators: `eq`, `lt`, `lte`, `gt`, `gte`, `between`, `beginsWith`. |
| `.filter((t, ops) => …)` | Post-read filter expression via callback. Type-safe attribute paths via `t`, condition operators via `ops`. |
| `.filter(shorthand)` | Post-read filter via shorthand object (e.g. `{ status: "active" }` or `{ gt: { price: 30 } }`). |
| `.select((t) => […paths])` | Projection expression via callback. Returns `BoundQuery<…, Record<string, unknown>>`. |
| `.select(["field", …])` | Projection via attribute name array. |
| `.limit(n)` | Maximum items per DynamoDB page. |
| `.maxPages(n)` | Maximum number of DynamoDB pages to fetch. |
| `.reverse()` | Reverse sort order (`ScanIndexForward = false`). |
| `.startFrom(cursor)` | Resume pagination from an opaque cursor returned by `.fetch()`. |
| `.consistentRead()` | Enable strongly consistent reads. |
| `.ignoreOwnership()` | Skip the `__edd_e__` entity-type filter. Use only when querying a polymorphic GSI shared across entity types and you want every item back. |
| Method | Returns |
| --- | --- |
| `.collect()` | `Effect<Array<A>, DynamoClientError \| ValidationError, never>` — drain all pages into a single array. |
| `.fetch()` | `Effect<Page<A>, DynamoClientError \| ValidationError, never>` — single page + opaque cursor. `Page<A>` is `{ items: Array<A>, cursor: string \| undefined }`. |
| `.paginate()` | `Stream<A, DynamoClientError \| ValidationError, never>` — lazy stream that paginates automatically. |
| `.count()` | `Effect<number, DynamoClientError, never>` — count-only query (`Select: COUNT`); items are not returned. |
.where() is conditionally available based on the SkRemaining type parameter. When the index’s SK composites have already been fully consumed (e.g., the entity has no SK composites, or all of them were supplied to the query accessor), .where() is not present on the type — calling it is a compile error. Once called, .where() consumes SkRemaining and the resulting BoundQuery no longer exposes .where().
```ts
const db = yield* DynamoClient.make({ entities: { Tasks } })

// Sort key condition + filter + limit
const recent = yield* db.entities.Tasks
  .byProject({ projectId: "p-1" })
  .where((t, { beginsWith }) => beginsWith(t.createdAt, "2026"))
  .filter((t, { eq }) => eq(t.status, "active"))
  .limit(20)
  .collect()

// Single page + opaque cursor
const page = yield* db.entities.Tasks
  .byProject({ projectId: "p-1" })
  .fetch()

// Lazy stream, consumed element by element
yield* db.entities.Tasks
  .byProject({ projectId: "p-1" })
  .paginate()
  .pipe(Stream.runForEach((task) => Console.log(task.title)))

const total = yield* db.entities.Tasks.byProject({ projectId: "p-1" }).count()
```
See Queries and Expressions for details.
Multi-entity queries across a shared index. Collections are auto-discovered from entity indexes that share the same collection property.
When multiple entities define the same collection name on the same GSI, they are automatically grouped into a collection accessible via db.collections.<name>():
```ts
// Define entities with shared collection
const Employees = Entity.make({
  // …model, entityType, primaryKey elided…
  indexes: {
    byTenant: {
      collection: "tenantMembers",
      pk: { field: "gsi1pk", composite: ["tenantId"] },
      sk: { field: "gsi1sk", composite: ["name"] },
    },
  },
})

const Tasks = Entity.make({
  // …model, entityType, primaryKey elided…
  indexes: {
    byTenant: {
      collection: "tenantMembers",
      pk: { field: "gsi1pk", composite: ["tenantId"] },
      sk: { field: "gsi1sk", composite: ["priority"] },
    },
  },
})

// Auto-discovered on the typed client
const db = yield* DynamoClient.make({ entities: { Employees, Tasks } })
const result = yield* db.collections.tenantMembers({ tenantId: "t-1" }).collect()
// result: { Employees: Employee[], Tasks: Task[] }
```
For advanced use, Collection.make() creates collections explicitly:
```ts
import { Collection } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | `(name, entities) => Collection` | Create a collection from entities sharing an index |
See Indexes & Collections for details.
Atomic multi-item operations.
```ts
import { Transaction } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `transactGet` | `(items) => Effect<Tuple>` | Atomically get up to 100 items — returns a typed tuple |
| `transactWrite` | `(ops) => Effect<void>` | Atomically write up to 100 items (puts, deletes, condition checks) |
| `check` | `(get, condition) => ConditionCheckOp` | Create a condition-check operation for transactWrite (dual API) |
```ts
const [user, order] = yield* Transaction.transactGet(
  UserEntity.get({ userId: "u-1" }),
  OrderEntity.get({ orderId: "o-1" }),
)

yield* Transaction.transactWrite(
  UserEntity.put({ userId: "u-1", ... }),
  OrderEntity.delete({ orderId: "o-old" }),
  Transaction.check(
    UserEntity.get({ userId: "u-1" }),
    { attributeExists: "email" },
  ),
)
```

| Type | Description |
| --- | --- |
| `ConditionCheckOp` | A condition-check operation for inclusion in transactWrite |
Batch get and write with auto-chunking and unprocessed item retry.
```ts
import { Batch } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `get` | `(...gets) => Effect<Tuple>` | Batch-get up to 100 items with typed tuple return |
| `write` | `(...ops) => Effect<void>` | Batch-write any number of items (auto-chunks at 25) |
```ts
// Batch get — typed positional results
const [user1, user2] = yield* Batch.get(
  UserEntity.get({ userId: "u-1" }),
  UserEntity.get({ userId: "u-2" }),
)

// Batch write — mixed puts and deletes
yield* Batch.write(
  UserEntity.put({ userId: "u-1", ... }),
  UserEntity.put({ userId: "u-2", ... }),
  OrderEntity.delete({ orderId: "o-old" }),
)
```
Auto-chunking: get chunks at 100 items, write chunks at 25 items. Both retry unprocessed items automatically.
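The chunking itself is plain array slicing; a standalone sketch of the behavior described above (the shape is assumed for illustration, not the library's code):

```typescript
// Illustrative sketch of batch auto-chunking — not effect-dynamodb's code.
// DynamoDB caps BatchGetItem at 100 keys and BatchWriteItem at 25 requests,
// so larger inputs are split into ceiling(n / size) calls.
function chunksOf<A>(items: ReadonlyArray<A>, size: number): A[][] {
  const out: A[][] = []
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size))
  }
  return out
}

// 60 write requests become three BatchWriteItem calls: 25 + 25 + 10
const batches = chunksOf(Array.from({ length: 60 }, (_, i) => i), 25)
console.log(batches.map((b) => b.length)) // [ 25, 25, 10 ]
```

On top of this, each response's `UnprocessedItems`/`UnprocessedKeys` would be fed back into a retry loop, which is the second half of what the module automates.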
Condition, filter, and update expression builders.
```ts
import { Expression } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `condition` | `(input: ConditionInput) => ExpressionResult` | Build ConditionExpression |
| `filter` | `(input: ConditionInput) => ExpressionResult` | Build FilterExpression |
| `update` | `(input: UpdateInput) => ExpressionResult` | Build UpdateExpression |

| Operator | Example | Description |
| --- | --- | --- |
| `eq` | `{ eq: { status: "active" } }` | Equality |
| `ne` | `{ ne: { status: "deleted" } }` | Not equal |
| `gt`, `gte`, `lt`, `lte` | `{ gt: { price: 0 } }` | Comparisons |
| `between` | `{ between: { price: [10, 50] } }` | Inclusive range |
| `beginsWith` | `{ beginsWith: { name: "A" } }` | String prefix |
| `attributeExists` | `{ attributeExists: "email" }` | Attribute exists |
| `attributeNotExists` | `{ attributeNotExists: "pk" }` | Attribute does not exist |

| Property | Example | Description |
| --- | --- | --- |
| `set` | `{ set: { name: "New" } }` | SET attribute values |
| `remove` | `{ remove: ["oldField"] }` | REMOVE attributes |
| `add` | `{ add: { count: 1 } }` | ADD to numeric/set attributes |
| `delete` | `{ delete: { tags: new Set(["old"]) } }` | DELETE from set attributes |

| Type | Description |
| --- | --- |
| `ExpressionResult` | `{ expression: string, names: Record, values: Record }` |
| `ConditionInput` | Declarative condition expression input |
| `UpdateInput` | Declarative update expression input |
```ts
const cond = Expression.condition({
  eq: { status: "active" },
  gt: { stock: 0 },
})
// cond.expression: "#status = :v0 AND #stock > :v1"
```
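The update builder follows the same expression/names/values shape. Below is an illustrative standalone sketch of the set/remove/add mapping; the placeholder naming is an assumption, not necessarily what `Expression.update` emits:

```typescript
// Illustrative UpdateInput → UpdateExpression sketch — not the library's code.
interface UpdateInputSketch {
  set?: Record<string, unknown>
  remove?: string[]
  add?: Record<string, number>
}

function buildUpdate(input: UpdateInputSketch): {
  expression: string
  names: Record<string, string>
  values: Record<string, unknown>
} {
  const names: Record<string, string> = {}
  const values: Record<string, unknown> = {}
  const parts: string[] = []
  let i = 0

  if (input.set) {
    const sets = Object.entries(input.set).map(([k, v]) => {
      names[`#${k}`] = k
      const val = `:v${i++}`
      values[val] = v
      return `#${k} = ${val}` // SET name = :value
    })
    parts.push(`SET ${sets.join(", ")}`)
  }
  if (input.remove?.length) {
    input.remove.forEach((k) => (names[`#${k}`] = k))
    parts.push(`REMOVE ${input.remove.map((k) => `#${k}`).join(", ")}`)
  }
  if (input.add) {
    const adds = Object.entries(input.add).map(([k, v]) => {
      names[`#${k}`] = k
      const val = `:v${i++}`
      values[val] = v
      return `#${k} ${val}` // ADD name :value (no "=" in DynamoDB's ADD syntax)
    })
    parts.push(`ADD ${adds.join(", ")}`)
  }
  return { expression: parts.join(" "), names, values }
}

const u = buildUpdate({ set: { name: "New" }, remove: ["oldField"], add: { count: 1 } })
console.log(u.expression) // "SET #name = :v0 REMOVE #oldField ADD #count :v1"
```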
See Expressions Guide for comprehensive reference.
Type-safe expression building with PathBuilder and ConditionOps.
```ts
import {
  compileExpr, createConditionOps, createPathBuilder,
  isExpr, parseShorthand, parseSimpleShorthand,
  type Expr, type ConditionOps, type Path, type PathBuilder,
} from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `createPathBuilder<M>()` | `() => PathBuilder<M, M>` | Create a path proxy for type-safe attribute access |
| `createConditionOps<M>()` | `() => ConditionOps<M>` | Create comparison and logical operators |
| `compileExpr(expr, resolveDbName?)` | `(Expr) => CompileResult` | Compile Expr to DynamoDB expression string |
| `isExpr(u)` | `(unknown) => u is Expr` | Type guard for Expr nodes |
| `parseShorthand(input)` | `(ConditionInput) => Expr` | Convert ConditionInput to Expr |
| `parseSimpleShorthand(input)` | `(Record) => Expr` | Convert `{ key: value }` to Expr |
Entity-Level Combinators:
| Combinator | Description |
| --- | --- |
| `Entity.condition(cb \| shorthand)` | Callback or object → condition combinator for put/update/delete |
| `Entity.filter(cb \| shorthand)` | Callback or object → filter combinator for query/scan |
| `Entity.select(cb \| attrs)` | Callback or string array → projection combinator |
Path-Based Update Combinators:
| Combinator | Description |
| --- | --- |
| `Entity.pathSet(op)` | SET nested path or attribute-to-attribute copy |
| `Entity.pathRemove(segments)` | REMOVE nested path or array element |
| `Entity.pathAdd(op)` | ADD to nested numeric/set |
| `Entity.pathSubtract(op)` | Subtract from nested numeric |
| `Entity.pathAppend(op)` | Append to nested list |
| `Entity.pathPrepend(op)` | Prepend to nested list |
| `Entity.pathIfNotExists(op)` | Set only if attribute doesn't exist |
| `Entity.pathDelete(op)` | DELETE from nested set |
Types:
| Type | Description |
| --- | --- |
| `Expr` | Discriminated union of 16 expression node types |
| `ConditionOps<Model>` | Typed comparison/logical operators for callbacks |
| `PathBuilder<Root, Model>` | Recursive proxy type for attribute path access |
| `Path<Root, Value, Keys>` | Resolved attribute path with phantom types |
| `SizeOperand<Root>` | `size()` operand for path-based size comparisons |
| `CompileResult` | `{ expression: string, names: Record, values: Record }` |
| `DeepPick<T, Paths>` | Type utility for projection return type narrowing |
ProjectionExpression builder for selecting specific attributes.
```ts
import { Projection } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `projection` | `(attrs: string[]) => ProjectionResult` | Build ProjectionExpression from attribute names |

| Type | Description |
| --- | --- |
| `ProjectionResult` | `{ expression: string, names: Record<string, string> }` |
```ts
const proj = Projection.projection(["name", "email", "status"])
// proj.expression: "#proj_name, #proj_email, #proj_status"
// proj.names: { "#proj_name": "name", "#proj_email": "email", "#proj_status": "status" }
```
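Since the example above documents the exact output shape, the builder is easy to sketch standalone. The `#proj_` prefix matches the documented example, but this is an illustration, not the library's source:

```typescript
// Illustrative reconstruction of the documented ProjectionResult shape.
// Every attribute gets a "#proj_"-prefixed name placeholder so that
// reserved words ("name", "status", …) are safe to project.
function projection(attrs: ReadonlyArray<string>): {
  expression: string
  names: Record<string, string>
} {
  const names: Record<string, string> = {}
  for (const attr of attrs) names[`#proj_${attr}`] = attr
  return {
    expression: attrs.map((a) => `#proj_${a}`).join(", "),
    names,
  }
}

const proj = projection(["name", "email", "status"])
console.log(proj.expression) // "#proj_name, #proj_email, #proj_status"
```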
Composite key composition from index definitions. Used internally by Entity, also available for advanced use cases.
```ts
import { KeyComposer } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `composePk` | `(schema, entity, index, record) => string` | Compose partition key value |
| `composeSk` | `(schema, entity, index, record) => string` | Compose sort key value |
| `composeIndexKeys` | `(schema, entity, index, record) => Record` | Compose all key attributes for an index |
| `tryComposeIndexKeys` | `(schema, entity, index, record) => Record \| undefined` | Non-throwing variant for sparse GSIs |
| `composeAllKeys` | `(schema, entity, indexes, record) => Record` | Compose keys for all indexes |
| `composeGsiKeysForUpdatePolicyAware` | `(schema, entity, indexes, updates, keyRecord, { removedSet? }) => { sets, removes }` | Policy-aware GSI key composition for update/append (sparse/preserve per indexPolicy) |
| `composeSortKeyPrefix` | `(schema, entity, index, composites) => string` | Partial SK prefix for `begins_with` queries |
| `extractComposites` | `(keyPart, record) => string[]` | Extract composite attribute values from a record |
| `tryExtractComposites` | `(keyPart, record) => string[] \| undefined` | Non-throwing variant |
| `serializeValue` | `(value) => string` | Serialize a value for key composition |

| Type | Description |
| --- | --- |
| `KeyPart` | `{ field: string, composite: string[] }` |
| `IndexDefinition` | `{ pk: KeyPart, sk: KeyPart, index?, collection?, type?, casing?, indexPolicy? }` |
| `IndexPolicyAttr` | `"sparse" \| "preserve"` |
| `IndexPolicy` | `(item) => Partial<Record<string, IndexPolicyAttr>>` |
| `GsiUpdateResult` | `{ sets: Record<string, string>, removes: ReadonlyArray<string> }` |
Thin wrapper around @aws-sdk/util-dynamodb.
```ts
import { Marshaller } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `toAttributeMap` | `(record) => Record<string, AttributeValue>` | Marshall JS object to DynamoDB format |
| `fromAttributeMap` | `(item) => Record<string, unknown>` | Unmarshall DynamoDB format to JS object |
| `toAttributeValue` | `(value) => AttributeValue` | Marshall a single value |
| `fromAttributeValue` | `(av) => unknown` | Unmarshall a single value |
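For intuition, here is a partial standalone sketch of DynamoDB's AttributeValue marshalling rules. The real module delegates to `@aws-sdk/util-dynamodb`; this sketch covers only strings, numbers, booleans, null, lists, and maps:

```typescript
// Partial reimplementation of DynamoDB marshalling for illustration —
// the library itself wraps @aws-sdk/util-dynamodb (which also handles
// Sets, binary data, undefined-stripping options, etc.).
type AttributeValue =
  | { S: string }
  | { N: string }
  | { BOOL: boolean }
  | { NULL: true }
  | { L: AttributeValue[] }
  | { M: Record<string, AttributeValue> }

function toAttributeValue(value: unknown): AttributeValue {
  if (value === null || value === undefined) return { NULL: true }
  if (typeof value === "string") return { S: value }
  if (typeof value === "number") return { N: String(value) } // numbers travel as strings
  if (typeof value === "boolean") return { BOOL: value }
  if (Array.isArray(value)) return { L: value.map(toAttributeValue) }
  if (typeof value === "object") {
    const m: Record<string, AttributeValue> = {}
    for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
      m[k] = toAttributeValue(v)
    }
    return { M: m }
  }
  throw new Error(`cannot marshall value of type ${typeof value}`)
}

console.log(toAttributeValue({ name: "Ada", age: 36 }))
// { M: { name: { S: "Ada" }, age: { N: "36" } } }
```

Note that numbers are wrapped as strings (`{ N: "36" }`): DynamoDB's wire format represents all numbers as decimal strings to avoid floating-point precision loss.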
Effect service wrapping AWS SDK DynamoDBClient.
```ts
import { DynamoClient } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `DynamoClient` | `Context.Service` | Effect service class |
| `DynamoClient.layer(config)` | `Layer<DynamoClient>` | Create live layer with region + optional endpoint/credentials |
| `DynamoClient.layerConfig(config)` | `Layer<DynamoClient, ConfigError>` | Create live layer from Effect Config providers. Accepts `{ region, endpoint?, credentials? }` as `Config.Config<...>` values |

| Method | Description |
| --- | --- |
| `createTable(input)` | Create a DynamoDB table |
| `deleteTable(input)` | Delete a DynamoDB table |
| `putItem(input)` | Put a single item |
| `getItem(input)` | Get a single item |
| `deleteItem(input)` | Delete a single item |
| `updateItem(input)` | Update an item with expression |
| `query(input)` | Query a table or index |
| `scan(input)` | Scan a table or index |
| `batchGetItem(input)` | Batch-get up to 100 items |
| `batchWriteItem(input)` | Batch-write up to 25 items |
| `transactGetItems(input)` | Transact-get up to 100 items |
| `transactWriteItems(input)` | Transact-write up to 100 items |
```ts
DynamoClient.layer({ region: "us-east-1" })

// Local endpoint + static credentials
DynamoClient.layer({
  region: "us-east-1",
  endpoint: "http://localhost:8000",
  credentials: { accessKeyId: "local", secretAccessKey: "local" },
})

// Config-based (reads from Effect Config — e.g., env vars)
DynamoClient.layerConfig({
  region: Config.string("AWS_REGION"),
  endpoint: Config.option(Config.string("DYNAMO_ENDPOINT")),
})
```
Graph-based composite domain model for DynamoDB. Binds a Schema.Class hierarchy to a DAG of underlying entity types sharing a partition key.
```ts
import { Aggregate } from "effect-dynamodb"
```

| Export | Type | Description |
| --- | --- | --- |
| `make` | Overloaded | Create a sub-aggregate or top-level aggregate (see below) |
| `one` | `(name, { entityType }) => OneEdge` | Create a one-to-one edge descriptor |
| `many` | `(name, config) => ManyEdge` | Create a one-to-many edge descriptor |
| `ref` | `(entity) => RefEdge` | Create a ref edge (inline hydration, no separate DynamoDB item) |
Sub-aggregate form — Aggregate.make(Schema, { root, edges }):
Returns a SubAggregate<TSchema> with a .with(config) method for discriminator binding.
```typescript
const TeamSheetAggregate = Aggregate.make(TeamSheet, {
  root: { entityType: "MatchTeam" },
  edges: {
    coach: Aggregate.one("coach", { entityType: "MatchCoach" }),
    players: Aggregate.many("players", { entityType: "MatchPlayer" }),
  },
})
```
Top-level form — Aggregate.make(Schema, { table, schema, pk, collection, root, refs?, edges }):
Returns an Aggregate<TSchema, TKey> with get, create, update, delete operations.
```typescript
const MatchAggregate = Aggregate.make(Match, {
  // table and schema bindings elided
  pk: { field: "pk", composite: ["id"] },
  collection: { index: "lsi1", name: "match", sk: { field: "lsi1sk", composite: ["name"] } },
  root: { entityType: "MatchItem" },
  refs: { Team: Teams, Player: Players, Coach: Coaches, Venue: Venues },
  edges: {
    venue: Aggregate.one("venue", { entityType: "MatchVenue" }),
    team1: TeamSheetAggregate.with({ discriminator: { teamNumber: 1 } }),
    team2: TeamSheetAggregate.with({ discriminator: { teamNumber: 2 } }),
  },
})
```
| Operation | Signature | Description |
| --- | --- | --- |
| `aggregate.get(key)` | `Effect<Domain, AggregateAssemblyError \| DynamoError \| ValidationError>` | Fetch and assemble by partition key |
| `aggregate.create(input)` | `Effect<Domain, AggregateWriteError>` | Create from input (ref IDs hydrated, sub-aggregate transactions) |
| `aggregate.update(key, fn)` | `Effect<Domain, AggregateWriteError>` | Fetch → mutate → diff → write changed groups. `fn` receives an `UpdateContext` with `{ state, cursor, optic, current }` |
| `aggregate.delete(key)` | `Effect<void, AggregateAssemblyError \| DynamoError>` | Remove all items in the partition |
| `aggregate.list(key)` | `Query<Domain>` | Query aggregates by collection index |
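The `update` flow (fetch → mutate → diff → write) only persists the item groups whose content actually changed after the mutation. A self-contained sketch of what such a diff step could look like, with hypothetical group names and a naive serialized comparison (not the library's internals):

```typescript
// Hypothetical sketch: each edge decomposes to an item group;
// only groups whose serialized form changed get written back.
type Groups = Record<string, unknown>

const changedGroups = (before: Groups, after: Groups): string[] =>
  Object.keys(after).filter(
    (name) => JSON.stringify(before[name]) !== JSON.stringify(after[name])
  )

const before = { root: { name: "Final" }, venue: { id: "v-1" } }
const after = { root: { name: "Grand Final" }, venue: { id: "v-1" } }

// Only the "root" group differs, so only it would be rewritten.
console.log(changedGroups(before, after))
```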
| Type | Description |
| --- | --- |
| `OneEdge` | One-to-one edge: `{ _tag: "OneEdge", name, entityType }` |
| `ManyEdge` | One-to-many edge: `{ _tag: "ManyEdge", name, entityType, edgeAttributes?, sk? }` |
| `RefEdge` | Inline ref edge: `{ _tag: "RefEdge", entity }` — no separate DynamoDB item |
| `AggregateEdge` | Union: `OneEdge \| ManyEdge \| RefEdge` |
| Type | Description |
| --- | --- |
| `SubAggregate<TSchema>` | Composable sub-aggregate with `.with(config)` for discriminator binding |
| `BoundSubAggregate<TSchema>` | Discriminator-bound sub-aggregate, ready to embed in a parent |
| `Aggregate<TSchema, TKey>` | Top-level aggregate with CRUD operations |
| `UpdateContext<TIso, TClass>` | Context provided to the update mutation: `{ state: TIso, cursor: Cursor<TIso>, optic: Optic.Iso<TIso, TIso>, current: TClass }` |
| Type | Description |
| --- | --- |
| `Aggregate.Type<A>` | Assembled domain type (e.g., `Match`) |
| `Aggregate.Key<A>` | Partition key type |
| Export | Description |
| --- | --- |
| `isOneEdge(edge)` | Check if an edge is a `OneEdge` |
| `isManyEdge(edge)` | Check if an edge is a `ManyEdge` |
See Aggregates & Refs for details.
Typed, Effect-native event sourcing on DynamoDB. Provides a Decider model for command-event-state aggregates, an EventStream repository per stream type, optimistic concurrency via stream versioning, and a commandHandler combinator for the read-decide-append cycle.
import { EventStore } from "effect-dynamodb"
| Export | Description |
| --- | --- |
| `EventStore.makeStream({ table, streamName, events, streamId, metadata? })` | Create an `EventStream<TEvent, TStreamIdFields, TMetadata>` bound to a Table. `events` is an array of `Schema.Class` event types, `streamId.composite` lists the stream-id composite fields, and the optional `metadata` is a Schema. |
| `EventStore.bind(stream)` | Resolve `DynamoClient` and `TableConfig` from context and return a `BoundEventStream` whose operations have `R = never`. Use inside `Context.Service` make effects. |
A Decider<State, Command, Event, E = never> encodes one aggregate’s command-event-state triad:
| Field | Type |
| --- | --- |
| `decide` | `(command: Command, state: State) => Effect<ReadonlyArray<Event>, E>` |
| `evolve` | `(state: State, event: Event) => State` (pure) |
| `initialState` | `State` |
EventStream and BoundEventStream expose the same operations. The only difference is the R parameter — EventStream ops require DynamoClient | TableConfig, BoundEventStream ops have R = never.
| Method | Returns |
| --- | --- |
| `append(streamId, events, expectedVersion, options?)` | `Effect<AppendResult<TEvent>, VersionConflict \| DynamoClientError \| ValidationError \| TransactionCancelled, R>` — atomically append events with optimistic concurrency. Each event becomes a `Put` with `attribute_not_exists(pk)` inside a single `TransactWriteItems`. The optional `options.metadata` is validated against the stream’s metadata schema. |
| `read(streamId)` | `Effect<ReadonlyArray<StreamEvent<TEvent>>, DynamoClientError \| ValidationError, R>` — read all events for a stream in version order. |
| `readFrom(streamId, afterVersion)` | `Effect<ReadonlyArray<StreamEvent<TEvent>>, DynamoClientError \| ValidationError, R>` — read events strictly after a given version. |
| `currentVersion(streamId)` | `Effect<number, DynamoClientError \| ValidationError, R>` — current head version (0 if the stream is empty). |
| `query.events(streamId)` | `Query<StreamEvent<TEvent>>` — raw query handle for use with Query combinators. |
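The optimistic-concurrency contract of `append` is a compare-and-append: the caller's expected version must equal the current head, and each appended event takes the next version slot. A self-contained, in-memory sketch of those semantics (the real implementation enforces this in DynamoDB via `attribute_not_exists(pk)` conditions inside one `TransactWriteItems`):

```typescript
// In-memory sketch of append's optimistic-concurrency contract.
type Stored<A> = { version: number; data: A }

class VersionConflict extends Error {}

const append = <A>(log: Stored<A>[], events: A[], expectedVersion: number) => {
  const head = log.length === 0 ? 0 : log[log.length - 1].version
  // Reject if another writer advanced the stream since we read it.
  if (head !== expectedVersion) {
    throw new VersionConflict(`expected ${expectedVersion}, head is ${head}`)
  }
  events.forEach((data, i) => log.push({ version: expectedVersion + i + 1, data }))
  return { version: expectedVersion + events.length, events }
}

const log: Stored<string>[] = []
append(log, ["MatchStarted"], 0)        // ok: head moves from 0 to 1
// append(log, ["InningsCompleted"], 0) // would throw VersionConflict: head is now 1
```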
BoundEventStream additionally exposes provide(effect) as an escape hatch to push services into an arbitrary effect.
| Export | Description |
| --- | --- |
| `EventStore.commandHandler(decider, stream)` | Build a `(streamId, command, options?) => Effect<CommandHandlerResult, …, R>` that reads, folds, decides, and appends atomically. Dual API: `commandHandler(decider, stream)` (data-first) or `stream.pipe(EventStore.commandHandler(decider))` (data-last). Works with both `EventStream` (`R = DynamoClient \| TableConfig`) and `BoundEventStream` (`R = never`). On a no-op command (`decide` returns `[]`) the handler returns the current state and version without writing. |
| Export | Description |
| --- | --- |
| `EventStore.fold(decider, events)` | Pure: fold a list of `StreamEvent<TEvent>` through `decider.evolve`, starting from `decider.initialState`. Dual API. |
| `EventStore.foldFrom(decider, startState, events)` | Pure: fold from a supplied starting state (e.g. snapshot + delta events). Dual API. |
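Folding is just a left fold of events through `evolve`. An equivalent, self-contained sketch using an illustrative decider (the event and state shapes here are hypothetical, modeled on the example below):

```typescript
type State = { innings: number; totalRuns: number }
type Event =
  | { _tag: "MatchStarted"; venue: string }
  | { _tag: "InningsCompleted"; innings: number; runs: number }

// Pure state transition: same shape as decider.evolve.
const evolve = (state: State, event: Event): State =>
  event._tag === "MatchStarted"
    ? state
    : { innings: event.innings, totalRuns: state.totalRuns + event.runs }

// fold(decider, events) is essentially events.reduce(evolve, initialState)
const fold = (initialState: State, events: Event[]): State =>
  events.reduce(evolve, initialState)

const state = fold({ innings: 0, totalRuns: 0 }, [
  { _tag: "MatchStarted", venue: "MCG" },
  { _tag: "InningsCompleted", innings: 1, runs: 287 },
])
console.log(state) // { innings: 1, totalRuns: 287 }
```

`foldFrom` is the same reduction started from a supplied state instead of `initialState`, which is what makes snapshot-plus-delta replays possible.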
| Type | Shape |
| --- | --- |
| `StreamEvent<A>` | `{ streamId, version, eventType, data: A, metadata, timestamp }` |
| `AppendResult<A>` | `{ version: number, events: ReadonlyArray<A> }` |
| `CommandHandlerResult<State, Event>` | `AppendResult<Event> & { state: State }` |
| `EventStream<TEvent, TStreamIdFields, TMetadata>` | Repository interface returned by `makeStream` |
| `BoundEventStream<TEvent, TStreamIdFields, TMetadata>` | Same shape as `EventStream` with `R = never` on every operation |
```typescript
import { EventStore } from "effect-dynamodb"
import { Effect, Schema, Context } from "effect"

class MatchStarted extends Schema.Class<MatchStarted>("MatchStarted")({
  venue: Schema.String,
}) {}

class InningsCompleted extends Schema.Class<InningsCompleted>("InningsCompleted")({
  innings: Schema.Number,
  runs: Schema.Number,
}) {}

const MatchEvents = EventStore.makeStream({
  // table and streamName bindings elided
  events: [MatchStarted, InningsCompleted],
  streamId: { composite: ["matchId"] },
})

const decider = {
  initialState: { innings: 0, totalRuns: 0 },
  decide: (cmd: { _tag: "Start"; venue: string }, _state) =>
    Effect.succeed([new MatchStarted({ venue: cmd.venue })]),
  evolve: (state, event) => {
    if (event._tag === "MatchStarted") return state
    return { innings: event.innings, totalRuns: state.totalRuns + event.runs }
  },
}

class MatchEventService extends Context.Service<MatchEventService>()(
  "@app/MatchEventService",
  {
    make: Effect.gen(function* () {
      const stream = yield* EventStore.bind(MatchEvents)
      const handle = EventStore.commandHandler(decider, stream)
      return {
        start: (matchId: string, venue: string) =>
          handle({ matchId }, { _tag: "Start", venue }),
        history: (matchId: string) => stream.read({ matchId }),
      }
    }),
  }
) {}
```
See Event Sourcing for a complete tutorial.
Tagged error types for precise error handling with catchTag.
```typescript
import {
  DynamoError, ThrottlingError, DynamoValidationError,
  InternalServerError, ResourceNotFoundError,
  ItemNotFound, ConditionalCheckFailed,
  ValidationError, TransactionCancelled, TransactionOverflow,
  UniqueConstraintViolation, OptimisticLockError,
  ItemDeleted, ItemNotDeleted,
  RefNotFound, AggregateAssemblyError,
  AggregateDecompositionError, AggregateTransactionOverflow,
  CascadePartialFailure, VersionConflict,
} from "effect-dynamodb"
```
| Error | Tag | Description |
| --- | --- | --- |
| `DynamoError` | `"DynamoError"` | AWS SDK error wrapper (includes operation and cause) |
| `ThrottlingError` | `"ThrottlingError"` | AWS SDK throttling — request rate exceeded |
| `DynamoValidationError` | `"DynamoValidationError"` | Malformed DynamoDB request |
| `InternalServerError` | `"InternalServerError"` | Transient DynamoDB failure |
| `ResourceNotFoundError` | `"ResourceNotFoundError"` | Table or index does not exist |
| `ItemNotFound` | `"ItemNotFound"` | `getItem` returned no item |
| `ConditionalCheckFailed` | `"ConditionalCheckFailed"` | Condition expression not met (put, update, delete, create) |
| `ValidationError` | `"ValidationError"` | Schema decode/encode failed |
| `TransactionCancelled` | `"TransactionCancelled"` | Transaction rejected (includes cancellation reasons) |
| `TransactionOverflow` | `"TransactionOverflow"` | Transaction exceeds the 100-item DynamoDB limit |
| `UniqueConstraintViolation` | `"UniqueConstraintViolation"` | Unique constraint violated on put/create |
| `OptimisticLockError` | `"OptimisticLockError"` | Version mismatch on `expectedVersion()` |
| `ItemDeleted` | `"ItemDeleted"` | Item is soft-deleted (`get` returns this instead of the item) |
| `ItemNotDeleted` | `"ItemNotDeleted"` | Restore called on an item that isn’t soft-deleted |
| `RefNotFound` | `"RefNotFound"` | Referenced entity not found during ref hydration (`entity`, `field`, `refEntity`, `refId`) |
| `AggregateAssemblyError` | `"AggregateAssemblyError"` | Aggregate read path failed — missing items, structural violations, or decode errors (`aggregate`, `reason`, `key`) |
| `AggregateDecompositionError` | `"AggregateDecompositionError"` | Aggregate write path failed — schema validation or structural error (`aggregate`, `member`, `reason`) |
| `AggregateTransactionOverflow` | `"AggregateTransactionOverflow"` | Sub-aggregate exceeds the 100-item transaction limit (`aggregate`, `subgraph`, `itemCount`, `limit`) |
| `CascadePartialFailure` | `"CascadePartialFailure"` | Cascade update partially failed in eventual mode (`sourceEntity`, `sourceId`, `succeeded`, `failed`, `errors`) |
| `VersionConflict` | `"VersionConflict"` | Event store version mismatch |
```typescript
const db = yield* DynamoClient.make({
  entities: { UserEntity },
})
const users = db.entities.UserEntity

const user = yield* users.get({ userId: "u-1" }).pipe(
  Effect.catchTag("ItemNotFound", () => Effect.succeed(null)),
  Effect.catchTag("DynamoError", (e) =>
    Effect.die(`DynamoDB ${e.operation} failed: ${e.cause}`)
  ),
)
```