Commit 2f22009

Samson Gebre and claude committed
Merge main into copilot/add-public-api-for-metadata
Resolved conflicts in _odata.py (kept write-blocking + guardrails from HEAD; adopted the _execute_raw/_build_sql pattern from main for proper URL encoding; merged all error code imports) and in test_sql_parse.py (kept both the JOIN extraction tests from HEAD and the _build_sql URL encoding tests from main).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2 parents e129eed + 78cd852 commit 2f22009

25 files changed

Lines changed: 7239 additions & 332 deletions

.claude/skills/dataverse-sdk-dev/SKILL.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ This skill provides guidance for developers working on the PowerPlatform Dataver
 
 ### API Design
 
-1. **Public methods in operation namespaces** - New public methods go in the appropriate namespace module under `src/PowerPlatform/Dataverse/operations/` (`records.py`, `query.py`, `tables.py`). The `client.py` file exposes these via namespace properties (`client.records`, `client.query`, `client.tables`). Public types and constants live in their own modules (e.g., `models/metadata.py`, `common/constants.py`)
+1. **Public methods in operation namespaces** - New public methods go in the appropriate namespace module under `src/PowerPlatform/Dataverse/operations/` (`records.py`, `query.py`, `tables.py`, `batch.py`). The `client.py` file exposes these via namespace properties (`client.records`, `client.query`, `client.tables`, `client.batch`). Public types and constants live in their own modules (e.g., `models/metadata.py`, `models/batch.py`, `common/constants.py`)
 2. **Every public method needs README example** - Public API methods must have examples in README.md
 3. **Reuse existing APIs** - Always check if an existing method can be used before making direct Web API calls
 4. **Update documentation** when adding features - Keep README and SKILL files (both copies) in sync
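The namespace-property pattern described in point 1 can be sketched in a few lines. This is an illustrative stand-in, not the SDK's actual implementation; all class and method names below are hypothetical.

```python
# Minimal sketch of the "operation namespace" pattern described above.
# Class and method names are illustrative, not the SDK's real API.

class RecordsNamespace:
    """Groups record operations; holds a reference back to the owning client."""

    def __init__(self, client):
        self._client = client

    def create(self, table, data):
        # A real implementation would issue an HTTP POST via the client here.
        return {"table": table, "data": data}


class Client:
    """Entry point that exposes operation namespaces as cached properties."""

    def __init__(self):
        self._records = None

    @property
    def records(self):
        # Lazily construct the namespace once, then reuse it.
        if self._records is None:
            self._records = RecordsNamespace(self)
        return self._records


client = Client()
print(client.records.create("account", {"name": "Contoso"})["table"])  # account
```

The property keeps `client.records.create(...)` call sites flat while the implementation stays in its own module.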

.claude/skills/dataverse-sdk-use/SKILL.md

Lines changed: 45 additions & 0 deletions
@@ -22,6 +22,7 @@ Use the PowerPlatform Dataverse Client Python SDK to interact with Microsoft Dat
 - `client.query` -- query and search operations
 - `client.tables` -- table metadata, columns, and relationships
 - `client.files` -- file upload operations
+- `client.batch` -- batch multiple operations into a single HTTP request
 
 ### Bulk Operations
 The SDK supports Dataverse's native bulk operations: pass lists to `create()` and `update()` for automatic bulk processing; for `delete()`, pass a list and set `use_bulk_delete` to use the bulk operation
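The list-vs-single dispatch behind bulk operations can be pictured as follows. This is a hedged, self-contained sketch of the idea only; the chunk size and function are illustrative and not the SDK's internals.

```python
# Illustrative sketch of list-based bulk dispatch (not the SDK's internals).

def chunked(items, size):
    """Yield successive chunks of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def create(table, payload, chunk_size=100):
    """Accept a single dict or a list of dicts, mimicking bulk dispatch."""
    if isinstance(payload, dict):
        # Single record: one request.
        return [f"created 1 {table}"]
    # List of records: process in chunks, one bulk request per chunk.
    return [f"created {len(chunk)} {table}" for chunk in chunked(payload, chunk_size)]


print(create("account", {"name": "Contoso"}))                        # ['created 1 account']
print(len(create("account", [{"name": str(i)} for i in range(250)])))  # 3
```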
@@ -369,6 +370,50 @@ client.files.upload(
 )
 ```
 
+### Batch Operations
+
+Use `client.batch` to send multiple operations in one HTTP request. All batch methods return `None`; results arrive via `BatchResult` after `execute()`.
+
+```python
+# Build a batch request
+batch = client.batch.new()
+batch.records.create("account", {"name": "Contoso"})
+batch.records.update("account", account_id, {"telephone1": "555-0100"})
+batch.records.get("account", account_id, select=["name"])
+batch.query.sql("SELECT TOP 5 name FROM account")
+
+result = batch.execute()
+for item in result.responses:
+    if item.is_success:
+        print(f"[OK] {item.status_code} entity_id={item.entity_id}")
+        if item.data:
+            # GET responses populate item.data with the parsed JSON record
+            print(item.data.get("name"))
+    else:
+        print(f"[ERR] {item.status_code}: {item.error_message}")
+
+# Transactional changeset (all succeed or roll back)
+with batch.changeset() as cs:
+    ref = cs.records.create("contact", {"firstname": "Alice"})
+    cs.records.update("account", account_id, {"primarycontactid@odata.bind": ref})
+
+# Continue on error
+result = batch.execute(continue_on_error=True)
+print(f"Succeeded: {len(result.succeeded)}, Failed: {len(result.failed)}")
+```
+
+**BatchResult properties:**
+- `result.responses` -- list of `BatchItemResponse` in submission order
+- `result.succeeded` -- responses with 2xx status codes
+- `result.failed` -- responses with non-2xx status codes
+- `result.has_errors` -- True if any response failed
+- `result.entity_ids` -- GUIDs from OData-EntityId headers (creates and updates)
+
+**Batch limitations:**
+- Maximum 1000 operations per batch
+- Paginated `records.get()` (without `record_id`) is not supported in batch
+- `flush_cache()` is not supported in batch
+
 ## Error Handling
 
 The SDK provides structured exceptions with detailed error information:
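The `BatchResult` bookkeeping described in this file's additions (partitioning responses into `succeeded`/`failed` by status code) can be modeled in a few lines. The classes below are illustrative stand-ins, not the SDK's `BatchResult`/`BatchItemResponse`:

```python
# Illustrative model of status-code partitioning, not the SDK's classes.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ItemResponse:
    status_code: int
    entity_id: Optional[str] = None

    @property
    def is_success(self) -> bool:
        # 2xx status codes count as success.
        return 200 <= self.status_code < 300


@dataclass
class Result:
    responses: List[ItemResponse] = field(default_factory=list)

    @property
    def succeeded(self):
        return [r for r in self.responses if r.is_success]

    @property
    def failed(self):
        return [r for r in self.responses if not r.is_success]

    @property
    def has_errors(self) -> bool:
        return bool(self.failed)


result = Result([ItemResponse(204, "guid-1"), ItemResponse(404)])
print(len(result.succeeded), len(result.failed), result.has_errors)  # 1 1 True
```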

README.md

Lines changed: 89 additions & 2 deletions
@@ -29,6 +29,7 @@ A Python client library for Microsoft Dataverse that provides a unified interfac
 - [Table management](#table-management)
 - [Relationship management](#relationship-management)
 - [File operations](#file-operations)
+- [Batch operations](#batch-operations)
 - [Next steps](#next-steps)
 - [Troubleshooting](#troubleshooting)
 - [Contributing](#contributing)
@@ -43,6 +44,7 @@ A Python client library for Microsoft Dataverse that provides a unified interfac
 - **🔗 Relationship Management**: Create one-to-many and many-to-many relationships between tables with full metadata control
 - **🐼 DataFrame Support**: Pandas wrappers for all CRUD operations, returning DataFrames and Series
 - **📎 File Operations**: Upload files to Dataverse file columns with automatic chunking for large files
+- **📦 Batch Operations**: Send multiple CRUD, table metadata, and SQL query operations in a single HTTP request with optional transactional changesets
 - **🔐 Azure Identity**: Built-in authentication using Azure Identity credential providers with comprehensive support
 - **🛡️ Error Handling**: Structured exception hierarchy with detailed error context and retry guidance
 
@@ -115,9 +117,9 @@ The SDK provides a simple, pythonic interface for Dataverse operations:
 
 | Concept | Description |
 |---------|-------------|
-| **DataverseClient** | Main entry point; provides `records`, `query`, `tables`, and `files` namespaces |
+| **DataverseClient** | Main entry point; provides `records`, `query`, `tables`, `files`, and `batch` namespaces |
 | **Context Manager** | Use `with DataverseClient(...) as client:` for automatic cleanup and HTTP connection pooling |
-| **Namespaces** | Operations are organized into `client.records` (CRUD & OData queries), `client.query` (QueryBuilder & SQL), `client.tables` (metadata), and `client.files` (file uploads) |
+| **Namespaces** | Operations are organized into `client.records` (CRUD & OData queries), `client.query` (QueryBuilder & SQL), `client.tables` (metadata), `client.files` (file uploads), and `client.batch` (batch requests) |
 | **Records** | Dataverse records represented as Python dictionaries with column schema names |
 | **Schema names** | Use table schema names (`"account"`, `"new_MyTestTable"`) and column schema names (`"name"`, `"new_MyTestColumn"`). See: [Table definitions in Microsoft Dataverse](https://learn.microsoft.com/en-us/power-apps/developer/data-platform/entity-metadata) |
 | **Bulk Operations** | Efficient bulk processing for multiple records with automatic optimization |
@@ -592,6 +594,90 @@ client.files.upload(
 )
 ```
 
+### Batch operations
+
+Use `client.batch` to send multiple operations in one HTTP request. The batch namespace mirrors `client.records`, `client.tables`, and `client.query`.
+
+```python
+# Build a batch request and add operations
+batch = client.batch.new()
+batch.records.create("account", {"name": "Contoso"})
+batch.records.create("account", [{"name": "Fabrikam"}, {"name": "Woodgrove"}])
+batch.records.update("account", account_id, {"telephone1": "555-0100"})
+batch.records.delete("account", old_id)
+batch.records.get("account", account_id, select=["name"])
+
+result = batch.execute()
+for item in result.responses:
+    if item.is_success:
+        print(f"[OK] {item.status_code} entity_id={item.entity_id}")
+    else:
+        print(f"[ERR] {item.status_code}: {item.error_message}")
+```
+
+**Transactional changeset** -- all operations in a changeset succeed or roll back together:
+
+```python
+batch = client.batch.new()
+with batch.changeset() as cs:
+    lead_ref = cs.records.create("lead", {"firstname": "Ada"})
+    contact_ref = cs.records.create("contact", {"firstname": "Ada"})
+    cs.records.create("account", {
+        "name": "Babbage & Co.",
+        "originatingleadid@odata.bind": lead_ref,
+        "primarycontactid@odata.bind": contact_ref,
+    })
+result = batch.execute()
+print(f"Created {len(result.entity_ids)} records atomically")
+```
+
+**Table metadata and SQL queries in a batch:**
+
+```python
+batch = client.batch.new()
+batch.tables.create("new_Product", {"new_Price": "decimal", "new_InStock": "bool"})
+batch.tables.add_columns("new_Product", {"new_Rating": "int"})
+batch.tables.get("new_Product")
+batch.query.sql("SELECT TOP 5 name FROM account")
+
+result = batch.execute()
+```
+
+**Continue on error** -- attempt all operations even when one fails:
+
+```python
+result = batch.execute(continue_on_error=True)
+print(f"Succeeded: {len(result.succeeded)}, Failed: {len(result.failed)}")
+for item in result.failed:
+    print(f"[ERR] {item.status_code}: {item.error_message}")
+```
+
+**DataFrame integration** -- feed pandas DataFrames directly into a batch:
+
+```python
+import pandas as pd
+
+batch = client.batch.new()
+
+# Create records from a DataFrame
+df = pd.DataFrame([{"name": "Contoso"}, {"name": "Fabrikam"}])
+batch.dataframe.create("account", df)
+
+# Update records from a DataFrame
+updates = pd.DataFrame([
+    {"accountid": id1, "telephone1": "555-0100"},
+    {"accountid": id2, "telephone1": "555-0200"},
+])
+batch.dataframe.update("account", updates, id_column="accountid")
+
+# Delete records from a Series
+batch.dataframe.delete("account", pd.Series([id1, id2]))
+
+result = batch.execute()
+```
+
+For a complete example see [examples/advanced/batch.py](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/batch.py).
+
 ## Next steps
 
 ### More sample code
@@ -606,6 +692,7 @@ Explore our comprehensive examples in the [`examples/`](https://github.com/micro
 - **[Complete Walkthrough](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/walkthrough.py)** - Full feature demonstration with production patterns
 - **[Relationship Management](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/relationships.py)** - Create and manage table relationships
 - **[File Upload](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/file_upload.py)** - Upload files to Dataverse file columns
+- **[Batch Operations](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/batch.py)** - Send multiple operations in a single request with changesets
 
 📖 See the [examples README](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/README.md) for detailed guidance and learning progression.
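On the wire, OData batch requests like the ones added in this README are multipart/mixed `$batch` payloads. The sketch below assembles a simplified body with the standard library to illustrate the format; the boundary string, URLs, and helper are illustrative, not what the SDK emits.

```python
# Simplified illustration of an OData $batch multipart body (not SDK output).
import json


def build_batch_body(operations, boundary="batch_abc123"):
    """Assemble a multipart/mixed $batch body from (method, url, payload) tuples."""
    parts = []
    for method, url, payload in operations:
        lines = [
            f"--{boundary}",
            "Content-Type: application/http",
            "Content-Transfer-Encoding: binary",
            "",
            f"{method} {url} HTTP/1.1",
            "Content-Type: application/json",
            "",
            json.dumps(payload) if payload is not None else "",
        ]
        parts.append("\r\n".join(lines))
    parts.append(f"--{boundary}--")  # closing boundary terminates the batch
    return "\r\n".join(parts)


body = build_batch_body([
    ("POST", "/api/data/v9.2/accounts", {"name": "Contoso"}),
    ("GET", "/api/data/v9.2/accounts(guid)?$select=name", None),
])
print(body.splitlines()[0])  # --batch_abc123
```

Changesets nest a second multipart body (with its own boundary) inside one of these parts, which is what gives them all-or-nothing semantics on the server.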
