chore: updated ai instruction set and readme (#789)
# Pull Request Description
## Summary
[Provide a brief description of the changes in this PR]
### Issue Reference
Fixes #[Issue Number]
### Motivation and Context
- Why is this change needed?
- What problem does it solve?
- If it fixes an open issue, please link to the issue here
### Dependencies
- List any dependencies that are required for this change
- Include any configuration changes needed
- Note any version updates required
## Type of Change
Please mark the relevant option with an `x`:
- [ ] 🐛 Bug fix (non-breaking change which fixes an issue)
- [ ] ✨ New feature (non-breaking change which adds functionality)
- [ ] 💥 Breaking change (fix or feature that would cause existing
functionality to not work as expected)
- [ ] 📝 Documentation update (Wiki/README/Code comments)
- [ ] ♻️ Refactor (code improvement without functional changes)
- [ ] 🎨 Style update (formatting, renaming)
- [ ] 🔧 Configuration change
- [ ] 📦 Dependency update
## Testing
- [ ] I have added unit tests that prove my fix is effective or that my
feature works
- [ ] New and existing unit tests pass locally with my changes
- [ ] I have tested this code in the following browsers/environments:
[list environments]
## Quality Checklist
- [ ] I have performed a self-review of my own code before requesting review
- [ ] I have verified there are no other open Pull Requests for the same
update/change
- [ ] All CI/CD pipelines pass without errors or warnings
- [ ] My code follows the established style guidelines and formatting standards of this project
- [ ] I have added necessary documentation (if appropriate)
- [ ] I have commented my code, particularly in complex areas
- [ ] I have made corresponding changes to the README and other relevant
documentation
- [ ] My changes generate no new warnings
## Screenshots/Recordings (if appropriate)
[Add screenshots or recordings that demonstrate the changes]
## Additional Notes
[Add any additional information that might be helpful for reviewers]
```diff
@@ -339,27 +339,56 @@ Use the Terraform plugin logger (`tflog`) for logging within resource implementations
 
 ## Testing Best Practices
 
-**Unit Tests:** For each new resource or data source, write unit tests covering all operations and edge cases. Use the `jarcoal/httpmock` library (already in the project) to simulate HTTP API responses.
-- Register **mock responders** for every HTTP call that the Create, Read, Update, or Delete functions will make. Each test step should set up the expected API responses (e.g. mock the POST response for Create, GET for Read, etc.).
-**Test Steps Lifecycle:** Structure unit tests in sequential steps to simulate resource lifecycle transitions:
-- **Step 1 (Create):** Call the resource's Create, then Read. Verify that after Create, the state read back includes all the created fields/attributes.
-- **Step 2 (Update):** Call Read (to get current state), then Update, then Read again. Ensure the first Read in this step matches the final state from the previous step, and the final Read reflects the updates applied.
-- **Step 3 (Delete):** Call Delete, then Read. After deletion, the final Read should return a "not found" error (e.g. 404) indicating the resource is gone.
-- If the resource supports import, write a dedicated test (single step) that calls Read (or Import) with a given `ImportStateId` and verifies the Terraform state import logic.
-- Include negative test cases: simulate API errors (such as 403 Forbidden or 500 Internal Server Error) and ensure the provider surfaces appropriate errors. Also test validation logic (e.g., providing an invalid parameter returns an error).
-- Place JSON fixtures for mock responses in the appropriate test data directory (e.g. `internal/resources/<service>/test/<resource>/<scenario>/response.json`). **Do not use real customer data** in tests – anonymize any IDs or personal info in your dummy data.
-- Name unit test functions with the `TestUnit` prefix and keep them in a `_test.go` file using the `<package>_test` package name.
-- All JSON responses for unit tests should be stored in `.json` files:
-  - Files should be placed in a folder named after the unit test that uses them; the folder name should omit `UnitTest`.
-  - Each unit test folder of `.json` files should live at `resources/{service_name}/test/resource` or `services/{service_name}/test/datasource`, alongside the other resource and/or datasource `.go` files.
-  - The `.json` file name should consist of the mock request method (`get`, `post`, `delete`) followed by `_` and the name of the returned mock object or action.
-  - File names must be sensible, with no spaces or special characters.
-
-**Acceptance Tests:** Add acceptance tests for any new resource covering the same scenarios as the unit tests, but against real Microsoft365 resources. These tests live in files with the `TestAcc...` prefix and require real credentials.
-**IMPORTANT: If you don't have access to a test tenant, DO NOT modify, rename, or remove existing acceptance tests.** Focus exclusively on writing unit tests instead. Existing acceptance tests have been verified to work correctly, and modifying them without the ability to test against a real Microsoft 365 environment can break the test suite.
-- Wrap any acceptance test with appropriate pre-check functions and environment variable checks so it skips if not configured.
-- Ensure each acceptance test cleans up after itself. Use `CheckDestroy` functions to verify that resources are actually deleted in Azure/Microsoft365 after the test run.
-- Keep acceptance tests focused and isolated (use separate environments or resource names to avoid conflicts).
+**Unit Tests:** For each new resource or data source, write unit tests in `resource_test.go` covering all operations and edge cases. Use the `jarcoal/httpmock` library to simulate HTTP API responses with resource-specific mock responders.
+**Test File Organization:**
+- **Unit Tests:** Place in `resource_test.go` with the `TestUnit*` naming pattern
+- **Package Naming:** Use the `<package>_test` suffix (e.g., `graphBetaWindowsAutopilotDeploymentProfile_test`)
+- **Mock Setup:** Create `setupMockEnvironment()` and `setupErrorMockEnvironment()` functions
+- **Mock Responders:** Store in a `mocks/responders.go` file within the resource directory
+**Mock Infrastructure:**
+- Register **mock responders** for every HTTP call using resource-specific mock structs
+- Use the `httpmock.Activate()` and `defer httpmock.DeactivateAndReset()` pattern
+- Call `mocks.SetupUnitTestEnvironment(t)` for consistent test environment setup
+- Use `mocks.TestUnitTestProtoV6ProviderFactories` for provider factories
+**Test Steps Lifecycle:** Structure unit tests using `resource.UnitTest()` with sequential steps:
+- **Step 1 (Create):** Verify resource creation and initial state population
+- **Step 2 (Update):** Test resource modification and state updates
+- **Step 3 (Delete):** Confirm resource deletion and state cleanup
+- **Import Step:** Test import functionality with `ImportState: true` and `ImportStateId`
+**Test Data Organization:**
+- Store JSON mock responses in `tests/responses/[operation]/[method]_[object]_[scenario].json`
+- **Do not use real customer data** – anonymize all IDs and personal information
+- Include negative test cases with error mock responses and validation testing
+- Use the `testCheckExists()` helper function for resource existence verification
+
+**Acceptance Tests:** Add acceptance tests in `resource_acceptance_test.go` for new resources using real Microsoft365 API calls. These require valid authentication credentials and a test environment.
+**Test File Organization:**
+- **Acceptance Tests:** Place in `resource_acceptance_test.go` with the `TestAcc*` naming pattern
+- **Package Naming:** Use the same `<package>_test` pattern as unit tests
+- **Environment Setup:** Use `mocks.TestAccPreCheck(t)` for pre-flight checks
+- **Provider Setup:** Use `mocks.TestAccProtoV6ProviderFactories` for provider factories
+**Centralized Test Infrastructure:**
+- Import and use helpers from the `internal/acceptance/` package
+- Use `acceptance.TestGraphClient()` for direct Graph API client access when needed
+- Leverage `acceptance.ProviderConfigBuilder` for dynamic provider configuration
+- Use centralized environment variable validation and setup functions
+**Test Structure:**
+- Use `resource.Test()` with real API calls against Microsoft 365
+- Include `PreCheck: func() { mocks.TestAccPreCheck(t) }` for environment validation
+- Add `CheckDestroy` functions to verify resource cleanup: `testAccCheck[ResourceName]Destroy`
+- Use an `ExternalProviders` map for dependencies such as the `azuread` and `random` providers
+**Test Scenarios:**
+- Test minimal and maximal resource configurations
+- Include assignment testing for resources with assignment capabilities
+**Environment Requirements:**
+- **IMPORTANT: If you don't have access to a test tenant, DO NOT modify existing acceptance tests**
+- Focus on unit tests only when lacking test environment access
+- Acceptance tests require valid `M365_*` environment variables for authentication
+- Use resource names with random suffixes to avoid conflicts between parallel test runs
 
 **Test Coverage:** Aim for **at least 80%** code coverage for unit tests on new code. `make unittest` reports a coverage score by service and overall. Focus on the service currently being worked on when adding tests to improve coverage.
@@ -409,24 +438,36 @@ Use the Terraform plugin logger (`tflog`) for logging within resource implementations
```
**README.md** (1 addition, 1 deletion)
```diff
@@ -69,7 +69,7 @@ Please refer to the [Getting Started](https://registry.terraform.io/providers/de
 
 ## Provider Comparison
 
-For information and a comparison between this provider in relation to the msft official terraform-provider-msgraph provider, see the [Provider Comparison](./docs/development/provider_comparison.md) documentation.
+For information and a comparison between this provider in relation to the msft official `terraform-provider-msgraph provider`, see the [Provider Comparison](./docs/development/provider_comparison.md) documentation.
```