Commit ed8b152 ("claude22")
1 parent 0149faa

2 files changed: 11 additions & 5 deletions


CLAUDE.md

Lines changed: 1 addition & 1 deletion
@@ -176,7 +176,7 @@ tags: Annotated[list, ColumnInfo(type="JSONB", serialize=json.dumps, deserialize
 
 ### Bulk Insert Modes
 - `unnest` (default): Most efficient for PostgreSQL using UNNEST
-- `array_safe`: Uses VALUES syntax with chunking for large datasets
+- `array_safe`: Uses VALUES syntax; required when model has array columns (PostgreSQL doesn't support arrays-of-arrays)
 - `executemany`: Uses asyncpg's executemany (slowest but most compatible)
 
 ## Version Info
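The arrays-of-arrays limitation behind this change can be sketched in a few lines. The helpers below are illustrative only, not sql_athame's actual API: `unnest_insert_sql` builds the one-flat-array-per-column statement that `unnest` mode relies on, and `values_insert_sql` builds the per-row placeholders that `array_safe` mode uses instead. If a column is itself array-typed, the UNNEST parameter for it would have to be an array of arrays, which PostgreSQL does not support.

```python
# Illustrative sketch (hypothetical helpers, not sql_athame's implementation).

def unnest_insert_sql(table: str, columns: dict[str, str]) -> str:
    """UNNEST-style bulk INSERT: one flat array parameter per column.

    columns maps column name -> SQL element type. An array-typed column
    would need an array-of-arrays parameter here, which PostgreSQL lacks.
    """
    params = ", ".join(
        f"${i}::{typ}[]" for i, typ in enumerate(columns.values(), start=1)
    )
    cols = ", ".join(columns)
    return f"INSERT INTO {table} ({cols}) SELECT * FROM unnest({params})"

def values_insert_sql(table: str, columns: list[str], nrows: int) -> str:
    """VALUES-style bulk INSERT (the 'array_safe' shape): one placeholder
    per cell, so array-typed cells are ordinary single parameters."""
    ncols = len(columns)
    rows = ", ".join(
        "(" + ", ".join(f"${r * ncols + c + 1}" for c in range(ncols)) + ")"
        for r in range(nrows)
    )
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES {rows}"

print(unnest_insert_sql("t", {"id": "int", "tags": "text"}))
# INSERT INTO t (id, tags) SELECT * FROM unnest($1::int[], $2::text[])
print(values_insert_sql("t", ["id", "tags"], 2))
# INSERT INTO t (id, tags) VALUES ($1, $2), ($3, $4)
```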

sql_athame/dataclasses.py

Lines changed: 10 additions & 4 deletions
@@ -954,8 +954,9 @@ def insert_multiple_sql(cls: type[T], rows: Iterable[T]) -> Fragment:
 def insert_multiple_array_safe_sql(cls: type[T], rows: Iterable[T]) -> Fragment:
     """Generate bulk INSERT SQL using VALUES syntax.
 
-    This method is safer for very large datasets as it doesn't create
-    large arrays that might exceed PostgreSQL limits.
+    This method is required when your model contains array columns, because
+    PostgreSQL doesn't support arrays-of-arrays (which UNNEST would require).
+    Use this instead of the UNNEST method when you have array-typed fields.
 
     Args:
         rows: Model instances to insert
@@ -1043,7 +1044,9 @@ async def insert_multiple_array_safe(
 ) -> str:
     """Insert multiple records using VALUES syntax with chunking.
 
-    This method chunks large datasets to avoid PostgreSQL array size limits.
+    This method is required when your model contains array columns, because
+    PostgreSQL doesn't support arrays-of-arrays (which UNNEST would require).
+    Data is processed in chunks to manage memory usage.
 
     Args:
         connection_or_pool: Database connection or pool
@@ -1075,7 +1078,7 @@ async def insert_multiple(
     Note:
         The actual method used depends on the insert_multiple_mode setting:
         - 'unnest': Most efficient, uses UNNEST (default)
-        - 'array_safe': Uses VALUES with chunking for large datasets
+        - 'array_safe': Uses VALUES syntax; required when model has array columns
         - 'executemany': Uses asyncpg's executemany, slowest but most compatible
     """
     if cls.insert_multiple_mode == "executemany":
@@ -1137,6 +1140,9 @@ async def upsert_multiple_array_safe(
 ) -> str:
     """Bulk upsert using VALUES syntax with chunking.
 
+    This method is required when your model contains array columns, because
+    PostgreSQL doesn't support arrays-of-arrays (which UNNEST would require).
+
     Args:
         connection_or_pool: Database connection or pool
         rows: Model instances to upsert
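The chunking these docstrings mention ("Data is processed in chunks to manage memory usage") can be sketched roughly as below; `chunked` and `CHUNK_SIZE` are hypothetical names for illustration, not sql_athame's internals.

```python
# Illustrative sketch of chunked processing for the array_safe modes.
from itertools import islice
from typing import Iterable, Iterator

CHUNK_SIZE = 1000  # hypothetical default, not sql_athame's actual setting

def chunked(rows: Iterable, size: int = CHUNK_SIZE) -> Iterator[list]:
    """Yield successive fixed-size chunks of rows; the last may be short."""
    it = iter(rows)
    while chunk := list(islice(it, size)):
        yield chunk

# Each chunk would become one VALUES-style INSERT/UPSERT statement, so a
# single statement never carries an unbounded number of parameters.
print([len(c) for c in chunked(range(2500))])  # [1000, 1000, 500]
```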
