
Commit 30d8a20

Abel Milash and claude committed:

Merge main and keep both TestAttributePayload and TestBuildUpsertMultiple classes
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

2 parents 6c92164 + 5cd086c

24 files changed

Lines changed: 6381 additions & 217 deletions


.claude/skills/dataverse-sdk-dev/SKILL.md

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ This skill provides guidance for developers working on the PowerPlatform Dataver

### API Design

1. **Public methods in operation namespaces** - New public methods go in the appropriate namespace module under `src/PowerPlatform/Dataverse/operations/` (`records.py`, `query.py`, `tables.py`, `batch.py`). The `client.py` file exposes these via namespace properties (`client.records`, `client.query`, `client.tables`, `client.batch`). Public types and constants live in their own modules (e.g., `models/metadata.py`, `models/batch.py`, `common/constants.py`)
2. **Every public method needs README example** - Public API methods must have examples in README.md
3. **Reuse existing APIs** - Always check if an existing method can be used before making direct Web API calls
4. **Update documentation** when adding features - Keep README and SKILL files (both copies) in sync
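The namespace layout described in point 1 can be sketched as follows. This is an illustrative stand-in, not the SDK's actual internals; the class names and the `create` return value are assumptions.

```python
# Illustrative sketch of the namespace-property pattern described above.
# Class names and behavior are assumptions, not the SDK's actual code.

class RecordsNamespace:
    """Would live in operations/records.py."""
    def __init__(self, client):
        self._client = client

    def create(self, table: str, data: dict) -> dict:
        # A real implementation would issue a Web API request via self._client;
        # here we just echo the arguments for demonstration.
        return {"table": table, "data": data}

class DataverseClient:
    """client.py exposes each namespace as a read-only property."""
    def __init__(self):
        self._records = RecordsNamespace(self)

    @property
    def records(self) -> RecordsNamespace:
        return self._records

client = DataverseClient()
print(client.records.create("account", {"name": "Contoso"}))
```

Keeping the public surface behind properties means `client.py` stays a thin facade while each operation family evolves in its own module.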

.claude/skills/dataverse-sdk-use/SKILL.md

Lines changed: 45 additions & 0 deletions

@@ -22,6 +22,7 @@ Use the PowerPlatform Dataverse Client Python SDK to interact with Microsoft Dat

- `client.query` -- query and search operations
- `client.tables` -- table metadata, columns, and relationships
- `client.files` -- file upload operations
- `client.batch` -- batch multiple operations into a single HTTP request

### Bulk Operations

The SDK supports Dataverse's native bulk operations: pass lists to `create()` and `update()` for automatic bulk processing; for `delete()`, set `use_bulk_delete` when passing a list to use the bulk operation.
@@ -370,6 +371,50 @@ client.files.upload(
)
```

### Batch Operations

Use `client.batch` to send multiple operations in one HTTP request. All batch methods return `None`; results arrive via `BatchResult` after `execute()`.

```python
# Build a batch request
batch = client.batch.new()
batch.records.create("account", {"name": "Contoso"})
batch.records.update("account", account_id, {"telephone1": "555-0100"})
batch.records.get("account", account_id, select=["name"])
batch.query.sql("SELECT TOP 5 name FROM account")

result = batch.execute()
for item in result.responses:
    if item.is_success:
        print(f"[OK] {item.status_code} entity_id={item.entity_id}")
        if item.data:
            # GET responses populate item.data with the parsed JSON record
            print(item.data.get("name"))
    else:
        print(f"[ERR] {item.status_code}: {item.error_message}")

# Transactional changeset (all succeed or roll back)
with batch.changeset() as cs:
    ref = cs.records.create("contact", {"firstname": "Alice"})
    cs.records.update("account", account_id, {"primarycontactid@odata.bind": ref})

# Continue on error
result = batch.execute(continue_on_error=True)
print(f"Succeeded: {len(result.succeeded)}, Failed: {len(result.failed)}")
```
**BatchResult properties:**

- `result.responses` -- list of `BatchItemResponse` in submission order
- `result.succeeded` -- responses with 2xx status codes
- `result.failed` -- responses with non-2xx status codes
- `result.has_errors` -- `True` if any response failed
- `result.entity_ids` -- GUIDs from `OData-EntityId` headers (creates and updates)

**Batch limitations:**

- Maximum 1000 operations per batch
- Paginated `records.get()` (without `record_id`) is not supported in a batch
- `flush_cache()` is not supported in a batch
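The `BatchResult` shape listed above can be modeled with a small self-contained sketch. The dataclasses below are illustrative stand-ins, not the SDK's actual classes; they exist only to make the property semantics concrete.

```python
# Illustrative model of BatchResult / BatchItemResponse; names mirror the
# documented properties but the implementation is an assumption.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BatchItemResponse:
    status_code: int
    entity_id: Optional[str] = None
    error_message: Optional[str] = None

    @property
    def is_success(self) -> bool:
        return 200 <= self.status_code < 300

@dataclass
class BatchResult:
    responses: List[BatchItemResponse] = field(default_factory=list)

    @property
    def succeeded(self) -> List[BatchItemResponse]:
        return [r for r in self.responses if r.is_success]

    @property
    def failed(self) -> List[BatchItemResponse]:
        return [r for r in self.responses if not r.is_success]

    @property
    def has_errors(self) -> bool:
        return len(self.failed) > 0

    @property
    def entity_ids(self) -> List[str]:
        # Only operations that return an OData-EntityId header carry a GUID
        return [r.entity_id for r in self.responses if r.entity_id is not None]

result = BatchResult(responses=[
    BatchItemResponse(204, entity_id="0000-aaaa"),
    BatchItemResponse(404, error_message="Not found"),
])
print(result.has_errors)      # True
print(len(result.succeeded))  # 1
```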
## Error Handling

The SDK provides structured exceptions with detailed error information:
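A structured hierarchy of this kind might look like the following sketch. The class names and attributes are assumptions for illustration; they are not the SDK's documented exception types.

```python
# Hypothetical sketch of a structured exception hierarchy with retry
# guidance; names and attributes are illustrative, not the SDK's API.

class DataverseError(Exception):
    """Base error carrying the HTTP status and a retry hint."""
    def __init__(self, message: str, status_code: int = 0, retryable: bool = False):
        super().__init__(message)
        self.status_code = status_code
        self.retryable = retryable

class DataverseRateLimitError(DataverseError):
    """429 responses; callers should back off and retry."""
    def __init__(self, message: str, retry_after: float = 0.0):
        super().__init__(message, status_code=429, retryable=True)
        self.retry_after = retry_after

try:
    raise DataverseRateLimitError("Too many requests", retry_after=5.0)
except DataverseError as err:
    # Catching the base class handles every SDK error uniformly
    if err.retryable:
        print(f"retry after {getattr(err, 'retry_after', 0)}s")
```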

README.md

Lines changed: 89 additions & 2 deletions
@@ -29,6 +29,7 @@ A Python client library for Microsoft Dataverse that provides a unified interfac

- [Table management](#table-management)
- [Relationship management](#relationship-management)
- [File operations](#file-operations)
- [Batch operations](#batch-operations)
- [Next steps](#next-steps)
- [Troubleshooting](#troubleshooting)
- [Contributing](#contributing)

@@ -43,6 +44,7 @@ A Python client library for Microsoft Dataverse that provides a unified interfac

- **🔗 Relationship Management**: Create one-to-many and many-to-many relationships between tables with full metadata control
- **🐼 DataFrame Support**: Pandas wrappers for all CRUD operations, returning DataFrames and Series
- **📎 File Operations**: Upload files to Dataverse file columns with automatic chunking for large files
- **📦 Batch Operations**: Send multiple CRUD, table metadata, and SQL query operations in a single HTTP request with optional transactional changesets
- **🔐 Azure Identity**: Built-in authentication using Azure Identity credential providers with comprehensive support
- **🛡️ Error Handling**: Structured exception hierarchy with detailed error context and retry guidance

@@ -115,9 +117,9 @@ The SDK provides a simple, pythonic interface for Dataverse operations:

| Concept | Description |
|---------|-------------|
| **DataverseClient** | Main entry point; provides `records`, `query`, `tables`, `files`, and `batch` namespaces |
| **Context Manager** | Use `with DataverseClient(...) as client:` for automatic cleanup and HTTP connection pooling |
| **Namespaces** | Operations are organized into `client.records` (CRUD & OData queries), `client.query` (QueryBuilder & SQL), `client.tables` (metadata), `client.files` (file uploads), and `client.batch` (batch requests) |
| **Records** | Dataverse records represented as Python dictionaries with column schema names |
| **Schema names** | Use table schema names (`"account"`, `"new_MyTestTable"`) and column schema names (`"name"`, `"new_MyTestColumn"`). See: [Table definitions in Microsoft Dataverse](https://learn.microsoft.com/en-us/power-apps/developer/data-platform/entity-metadata) |
| **Bulk Operations** | Efficient bulk processing for multiple records with automatic optimization |
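The context-manager row above guarantees cleanup even when an operation raises. A minimal stand-in (not the SDK's real client) shows the pattern:

```python
# Minimal stand-in for the context-manager behavior described above;
# DataverseClient's actual cleanup logic is not shown here.

class FakeDataverseClient:
    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # HTTP connection-pool teardown would happen here
        self.closed = True
        return False  # propagate any exception

with FakeDataverseClient() as client:
    pass
print(client.closed)  # True
```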
@@ -514,6 +516,90 @@ client.files.upload(
)
```

### Batch operations

Use `client.batch` to send multiple operations in one HTTP request. The batch namespace mirrors `client.records`, `client.tables`, and `client.query`.

```python
# Build a batch request and add operations
batch = client.batch.new()
batch.records.create("account", {"name": "Contoso"})
batch.records.create("account", [{"name": "Fabrikam"}, {"name": "Woodgrove"}])
batch.records.update("account", account_id, {"telephone1": "555-0100"})
batch.records.delete("account", old_id)
batch.records.get("account", account_id, select=["name"])

result = batch.execute()
for item in result.responses:
    if item.is_success:
        print(f"[OK] {item.status_code} entity_id={item.entity_id}")
    else:
        print(f"[ERR] {item.status_code}: {item.error_message}")
```
**Transactional changeset** -- all operations in a changeset succeed or roll back together:

```python
batch = client.batch.new()
with batch.changeset() as cs:
    lead_ref = cs.records.create("lead", {"firstname": "Ada"})
    contact_ref = cs.records.create("contact", {"firstname": "Ada"})
    cs.records.create("account", {
        "name": "Babbage & Co.",
        "originatingleadid@odata.bind": lead_ref,
        "primarycontactid@odata.bind": contact_ref,
    })
result = batch.execute()
print(f"Created {len(result.entity_ids)} records atomically")
```
**Table metadata and SQL queries in a batch:**

```python
batch = client.batch.new()
batch.tables.create("new_Product", {"new_Price": "decimal", "new_InStock": "bool"})
batch.tables.add_columns("new_Product", {"new_Rating": "int"})
batch.tables.get("new_Product")
batch.query.sql("SELECT TOP 5 name FROM account")

result = batch.execute()
```
**Continue on error** -- attempt all operations even when one fails:

```python
result = batch.execute(continue_on_error=True)
print(f"Succeeded: {len(result.succeeded)}, Failed: {len(result.failed)}")
for item in result.failed:
    print(f"[ERR] {item.status_code}: {item.error_message}")
```
**DataFrame integration** -- feed pandas DataFrames directly into a batch:

```python
import pandas as pd

batch = client.batch.new()

# Create records from a DataFrame
df = pd.DataFrame([{"name": "Contoso"}, {"name": "Fabrikam"}])
batch.dataframe.create("account", df)

# Update records from a DataFrame
updates = pd.DataFrame([
    {"accountid": id1, "telephone1": "555-0100"},
    {"accountid": id2, "telephone1": "555-0200"},
])
batch.dataframe.update("account", updates, id_column="accountid")

# Delete records from a Series
batch.dataframe.delete("account", pd.Series([id1, id2]))

result = batch.execute()
```

For a complete example, see [examples/advanced/batch.py](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/batch.py).
## Next steps

### More sample code

@@ -528,6 +614,7 @@ Explore our comprehensive examples in the [`examples/`](https://github.com/micro

- **[Complete Walkthrough](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/walkthrough.py)** - Full feature demonstration with production patterns
- **[Relationship Management](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/relationships.py)** - Create and manage table relationships
- **[File Upload](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/file_upload.py)** - Upload files to Dataverse file columns
- **[Batch Operations](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/batch.py)** - Send multiple operations in a single request with changesets

📖 See the [examples README](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/README.md) for detailed guidance and learning progression.
