Commit f2b363a

Merge upstream/main into feature/metadata

Incorporate latest changes from microsoft/PowerPlatform-DataverseClient-Python:

- Batch API with changeset, upsert, and DataFrame integration (microsoft#129)
- Optimize picklist label resolution with bulk fetch (microsoft#154)
- Add memo/multiline column type support (microsoft#155)
- Add unit test coverage and CI coverage reporting (microsoft#158)

Made-with: Cursor

2 parents: f660dc5 + 9cff47f

38 files changed

Lines changed: 9812 additions & 352 deletions

.azdo/ci-pr.yaml

Lines changed: 18 additions & 6 deletions

@@ -42,7 +42,7 @@ extends:
 
   - script: |
       python -m pip install --upgrade pip
-      python -m pip install flake8 black build
+      python -m pip install flake8 black build diff-cover
       python -m pip install -e .[dev]
     displayName: 'Install dependencies'
 
@@ -60,18 +60,30 @@ extends:
   - script: |
       python -m build
     displayName: 'Build package'
-
+
   - script: |
       python -m pip install dist/*.whl
     displayName: 'Install wheel'
-
+
   - script: |
-      pytest
+      PYTHONPATH=src pytest --junitxml=test-results.xml --cov --cov-report=xml
     displayName: 'Test with pytest'
-
+
+  - script: |
+      git fetch origin main
+      diff-cover coverage.xml --compare-branch=origin/main --fail-under=90
+    displayName: 'Diff coverage (90% for new changes)'
+
   - task: PublishTestResults@2
     condition: succeededOrFailed()
     inputs:
-      testResultsFiles: '**/test-*.xml'
+      testResultsFiles: '**/test-results.xml'
       testRunTitle: 'Python 3.12'
     displayName: 'Publish test results'
+
+  - task: PublishCodeCoverageResults@2
+    condition: succeededOrFailed()
+    inputs:
+      summaryFileLocation: '**/coverage.xml'
+      pathToSources: '$(Build.SourcesDirectory)/src'
+    displayName: 'Publish code coverage'
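The new `diff-cover` gate checks coverage only over the lines this diff changes, not the whole codebase. As a conceptual sketch (not diff-cover's actual implementation), the check amounts to:

```python
def diff_coverage(changed_lines: set[int], covered_lines: set[int]) -> float:
    """Percentage of the diff's changed lines that tests executed."""
    if not changed_lines:
        return 100.0  # nothing changed, nothing to cover
    covered_changed = changed_lines & covered_lines
    return 100.0 * len(covered_changed) / len(changed_lines)


# A change touching lines 10-19, with tests covering lines 1-17 of the file:
pct = diff_coverage(set(range(10, 20)), set(range(1, 18)))
print(f"{pct:.0f}%")  # 80% -- would fail the --fail-under=90 gate
```

This is why the pipeline fetches `origin/main` first: diff-cover needs the compare branch to compute which lines changed.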

.claude/skills/dataverse-sdk-dev/SKILL.md

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ This skill provides guidance for developers working on the PowerPlatform Dataver
 
 ### API Design
 
-1. **Public methods in operation namespaces** - New public methods go in the appropriate namespace module under `src/PowerPlatform/Dataverse/operations/` (`records.py`, `query.py`, `tables.py`). The `client.py` file exposes these via namespace properties (`client.records`, `client.query`, `client.tables`). Public types and constants live in their own modules (e.g., `models/table_info.py`, `common/constants.py`)
+1. **Public methods in operation namespaces** - New public methods go in the appropriate namespace module under `src/PowerPlatform/Dataverse/operations/` (`records.py`, `query.py`, `tables.py`, `batch.py`). The `client.py` file exposes these via namespace properties (`client.records`, `client.query`, `client.tables`, `client.batch`). Public types and constants live in their own modules (e.g., `models/metadata.py`, `models/batch.py`, `common/constants.py`)
 2. **Every public method needs README example** - Public API methods must have examples in README.md
 3. **Reuse existing APIs** - Always check if an existing method can be used before making direct Web API calls
 4. **Update documentation** when adding features - Keep README and SKILL files (both copies) in sync

.claude/skills/dataverse-sdk-use/SKILL.md

Lines changed: 46 additions & 0 deletions

@@ -22,6 +22,7 @@ Use the PowerPlatform Dataverse Client Python SDK to interact with Microsoft Dat
 - `client.query` -- query and search operations
 - `client.tables` -- table metadata, columns, and relationships
 - `client.files` -- file upload operations
+- `client.batch` -- batch multiple operations into a single HTTP request
 
 ### Bulk Operations
 The SDK supports Dataverse's native bulk operations: Pass lists to `create()`, `update()` for automatic bulk processing, for `delete()`, set `use_bulk_delete` when passing lists to use bulk operation
@@ -249,6 +250,7 @@ table_info = client.tables.create(
 #### Supported Column Types
 Types on the same line map to the same exact format under the hood
 - `"string"` or `"text"` - Single line of text
+- `"memo"` or `"multiline"` - Multiple lines of text (4000 character default)
 - `"int"` or `"integer"` - Whole number
 - `"decimal"` or `"money"` - Decimal number
 - `"float"` or `"double"` - Floating point number
@@ -426,6 +428,50 @@ client.files.upload(
 )
 ```
 
+### Batch Operations
+
+Use `client.batch` to send multiple operations in one HTTP request. All batch methods return `None`; results arrive via `BatchResult` after `execute()`.
+
+```python
+# Build a batch request
+batch = client.batch.new()
+batch.records.create("account", {"name": "Contoso"})
+batch.records.update("account", account_id, {"telephone1": "555-0100"})
+batch.records.get("account", account_id, select=["name"])
+batch.query.sql("SELECT TOP 5 name FROM account")
+
+result = batch.execute()
+for item in result.responses:
+    if item.is_success:
+        print(f"[OK] {item.status_code} entity_id={item.entity_id}")
+        if item.data:
+            # GET responses populate item.data with the parsed JSON record
+            print(item.data.get("name"))
+    else:
+        print(f"[ERR] {item.status_code}: {item.error_message}")
+
+# Transactional changeset (all succeed or roll back)
+with batch.changeset() as cs:
+    ref = cs.records.create("contact", {"firstname": "Alice"})
+    cs.records.update("account", account_id, {"primarycontactid@odata.bind": ref})
+
+# Continue on error
+result = batch.execute(continue_on_error=True)
+print(f"Succeeded: {len(result.succeeded)}, Failed: {len(result.failed)}")
+```
+
+**BatchResult properties:**
+- `result.responses` -- list of `BatchItemResponse` in submission order
+- `result.succeeded` -- responses with 2xx status codes
+- `result.failed` -- responses with non-2xx status codes
+- `result.has_errors` -- True if any response failed
+- `result.entity_ids` -- GUIDs from OData-EntityId headers (creates and updates)
+
+**Batch limitations:**
+- Maximum 1000 operations per batch
+- Paginated `records.get()` (without `record_id`) is not supported in batch
+- `flush_cache()` is not supported in batch
+
 ## Error Handling
 
 The SDK provides structured exceptions with detailed error information:

.github/workflows/python-package.yml

Lines changed: 25 additions & 4 deletions

@@ -18,6 +18,8 @@ jobs:
 
     steps:
     - uses: actions/checkout@v4
+      with:
+        fetch-depth: 0
 
     - name: Set up Python 3.12
       uses: actions/setup-python@v5
@@ -27,7 +29,7 @@ jobs:
     - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
-        python -m pip install flake8 black build
+        python -m pip install flake8 black build diff-cover
        python -m pip install -e .[dev]
 
    - name: Check format with black
@@ -44,11 +46,30 @@ jobs:
    - name: Build package
      run: |
        python -m build
-
+
    - name: Install wheel
      run: |
        python -m pip install dist/*.whl
-
+
    - name: Test with pytest
      run: |
-        pytest
+        PYTHONPATH=src pytest --junitxml=test-results.xml --cov --cov-report=xml
+
+    - name: Diff coverage (90% for new changes)
+      run: |
+        git fetch origin ${{ github.base_ref }}
+        diff-cover coverage.xml --compare-branch=origin/${{ github.base_ref }} --fail-under=90
+
+    - name: Upload test results
+      if: always()
+      uses: actions/upload-artifact@v4
+      with:
+        name: test-results
+        path: test-results.xml
+
+    - name: Upload coverage report
+      if: always()
+      uses: actions/upload-artifact@v4
+      with:
+        name: coverage-report
+        path: coverage.xml

README.md

Lines changed: 91 additions & 3 deletions

@@ -29,6 +29,7 @@ A Python client library for Microsoft Dataverse that provides a unified interfac
 - [Table management](#table-management)
 - [Relationship management](#relationship-management)
 - [File operations](#file-operations)
+- [Batch operations](#batch-operations)
 - [Next steps](#next-steps)
 - [Troubleshooting](#troubleshooting)
 - [Contributing](#contributing)
@@ -43,6 +44,7 @@ A Python client library for Microsoft Dataverse that provides a unified interfac
 - **🔗 Relationship Management**: Create one-to-many and many-to-many relationships between tables with full metadata control
 - **🐼 DataFrame Support**: Pandas wrappers for all CRUD operations, returning DataFrames and Series
 - **📎 File Operations**: Upload files to Dataverse file columns with automatic chunking for large files
+- **📦 Batch Operations**: Send multiple CRUD, table metadata, and SQL query operations in a single HTTP request with optional transactional changesets
 - **🔐 Azure Identity**: Built-in authentication using Azure Identity credential providers with comprehensive support
 - **🛡️ Error Handling**: Structured exception hierarchy with detailed error context and retry guidance
 
@@ -115,9 +117,9 @@ The SDK provides a simple, pythonic interface for Dataverse operations:
 
 | Concept | Description |
 |---------|-------------|
-| **DataverseClient** | Main entry point; provides `records`, `query`, `tables`, and `files` namespaces |
+| **DataverseClient** | Main entry point; provides `records`, `query`, `tables`, `files`, and `batch` namespaces |
 | **Context Manager** | Use `with DataverseClient(...) as client:` for automatic cleanup and HTTP connection pooling |
-| **Namespaces** | Operations are organized into `client.records` (CRUD & OData queries), `client.query` (QueryBuilder & SQL), `client.tables` (metadata), and `client.files` (file uploads) |
+| **Namespaces** | Operations are organized into `client.records` (CRUD & OData queries), `client.query` (QueryBuilder & SQL), `client.tables` (metadata), `client.files` (file uploads), and `client.batch` (batch requests) |
 | **Records** | Dataverse records represented as Python dictionaries with column schema names |
 | **Schema names** | Use table schema names (`"account"`, `"new_MyTestTable"`) and column schema names (`"name"`, `"new_MyTestColumn"`). See: [Table definitions in Microsoft Dataverse](https://learn.microsoft.com/en-us/power-apps/developer/data-platform/entity-metadata) |
 | **Bulk Operations** | Efficient bulk processing for multiple records with automatic optimization |
@@ -402,6 +404,7 @@ for page in client.records.get(
 # Create a custom table, including the customization prefix value in the schema names for the table and columns.
 table_info = client.tables.create("new_Product", {
     "new_Code": "string",
+    "new_Description": "memo",
     "new_Price": "decimal",
     "new_Active": "bool"
 })
@@ -547,6 +550,90 @@ client.files.upload(
 )
 ```
 
+### Batch operations
+
+Use `client.batch` to send multiple operations in one HTTP request. The batch namespace mirrors `client.records`, `client.tables`, and `client.query`.
+
+```python
+# Build a batch request and add operations
+batch = client.batch.new()
+batch.records.create("account", {"name": "Contoso"})
+batch.records.create("account", [{"name": "Fabrikam"}, {"name": "Woodgrove"}])
+batch.records.update("account", account_id, {"telephone1": "555-0100"})
+batch.records.delete("account", old_id)
+batch.records.get("account", account_id, select=["name"])
+
+result = batch.execute()
+for item in result.responses:
+    if item.is_success:
+        print(f"[OK] {item.status_code} entity_id={item.entity_id}")
+    else:
+        print(f"[ERR] {item.status_code}: {item.error_message}")
+```
+
+**Transactional changeset** — all operations in a changeset succeed or roll back together:
+
+```python
+batch = client.batch.new()
+with batch.changeset() as cs:
+    lead_ref = cs.records.create("lead", {"firstname": "Ada"})
+    contact_ref = cs.records.create("contact", {"firstname": "Ada"})
+    cs.records.create("account", {
+        "name": "Babbage & Co.",
+        "originatingleadid@odata.bind": lead_ref,
+        "primarycontactid@odata.bind": contact_ref,
+    })
+result = batch.execute()
+print(f"Created {len(result.entity_ids)} records atomically")
+```
+
+**Table metadata and SQL queries in a batch:**
+
+```python
+batch = client.batch.new()
+batch.tables.create("new_Product", {"new_Price": "decimal", "new_InStock": "bool"})
+batch.tables.add_columns("new_Product", {"new_Rating": "int"})
+batch.tables.get("new_Product")
+batch.query.sql("SELECT TOP 5 name FROM account")
+
+result = batch.execute()
+```
+
+**Continue on error** — attempt all operations even when one fails:
+
+```python
+result = batch.execute(continue_on_error=True)
+print(f"Succeeded: {len(result.succeeded)}, Failed: {len(result.failed)}")
+for item in result.failed:
+    print(f"[ERR] {item.status_code}: {item.error_message}")
+```
+
+**DataFrame integration** -- feed pandas DataFrames directly into a batch:
+
+```python
+import pandas as pd
+
+batch = client.batch.new()
+
+# Create records from a DataFrame
+df = pd.DataFrame([{"name": "Contoso"}, {"name": "Fabrikam"}])
+batch.dataframe.create("account", df)
+
+# Update records from a DataFrame
+updates = pd.DataFrame([
+    {"accountid": id1, "telephone1": "555-0100"},
+    {"accountid": id2, "telephone1": "555-0200"},
+])
+batch.dataframe.update("account", updates, id_column="accountid")

+# Delete records from a Series
+batch.dataframe.delete("account", pd.Series([id1, id2]))
+
+result = batch.execute()
+```
+
+For a complete example see [examples/advanced/batch.py](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/batch.py).
+
 ## Next steps
 
 ### More sample code
@@ -561,6 +648,7 @@ Explore our comprehensive examples in the [`examples/`](https://github.com/micro
 - **[Complete Walkthrough](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/walkthrough.py)** - Full feature demonstration with production patterns
 - **[Relationship Management](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/relationships.py)** - Create and manage table relationships
 - **[File Upload](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/file_upload.py)** - Upload files to Dataverse file columns
+- **[Batch Operations](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/advanced/batch.py)** - Send multiple operations in a single request with changesets
 
 📖 See the [examples README](https://github.com/microsoft/PowerPlatform-DataverseClient-Python/blob/main/examples/README.md) for detailed guidance and learning progression.
 
@@ -621,7 +709,7 @@ For optimal performance in production environments:
 ### Limitations
 
 - SQL queries are **read-only** and support a limited subset of SQL syntax
-- Create Table supports a limited number of column types (string, int, decimal, bool, datetime, picklist)
+- Create Table supports the following column types: string, memo, int, decimal, float, bool, datetime, file, and picklist (Enum subclass)
 - File uploads are limited by Dataverse file size restrictions (default 128MB per file)
 
 ## Contributing
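The SKILL file in this commit caps batches at 1000 operations. For workloads larger than that, a caller would split the rows and submit one batch per chunk. A minimal client-side splitter (a hypothetical helper, not part of the SDK):

```python
from typing import Iterator

MAX_BATCH_OPS = 1000  # per-batch operation limit noted in the SKILL file


def chunked(items: list, size: int = MAX_BATCH_OPS) -> Iterator[list]:
    """Yield consecutive slices of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]


# 2500 rows would need three batches: 1000 + 1000 + 500
rows = [{"name": f"Account {i}"} for i in range(2500)]
sizes = [len(chunk) for chunk in chunked(rows)]
print(sizes)  # [1000, 1000, 500]
```

Each chunk would then be submitted via its own `batch = client.batch.new()` / `batch.execute()` round trip; note that atomicity (changesets) only holds within a single batch, not across chunks.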
