Commit e7a061a
* fix: correct onnx attributeproto field numbers per spec
Changed field numbers to match ONNX protobuf specification:
- Field 20 for type (was field 3)
- Field 3 for int value (was field 4)
- Field 2 for float value (was field 5)
- Field 4 for string value (was field 6)
- Field 8 for repeated ints (unchanged, was correct)
This prevents corrupt ONNX attributes when exporting models.
Fixes critical code review issue #4 from PR #424.
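Why field numbers matter here: in the protobuf wire format, each serialized field is prefixed by a tag computed as `(field_number << 3) | wire_type`, so writing a value under the wrong field number produces bytes that a spec-compliant ONNX reader assigns to a different attribute entirely. A minimal sketch (in Python for illustration; the `field_tag`/`encode_varint` helpers are hypothetical, and the wire types shown are the standard proto3 ones for these field kinds):

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a protobuf varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def field_tag(field_number: int, wire_type: int) -> bytes:
    """Protobuf tag: (field_number << 3) | wire_type, varint-encoded."""
    return encode_varint((field_number << 3) | wire_type)

# AttributeProto fields named in this commit (wire types per proto3):
print(field_tag(20, 0).hex())  # type  (enum, varint)         -> 'a001'
print(field_tag(3, 0).hex())   # int   (varint)               -> '18'
print(field_tag(2, 5).hex())   # float (fixed32)              -> '15'
print(field_tag(4, 2).hex())   # string (length-delimited)    -> '22'
```

Writing the type under field 3 instead of 20 would emit tag `0x18`, which a reader interprets as the int-value field: the "corrupt ONNX attributes" this commit fixes.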
Generated with Claude Code
Co-Authored-By: Claude <noreply@anthropic.com>
* fix: preserve coreml-specific configuration during export
CoreMLExporter was converting CoreMLConfiguration to generic ExportConfiguration,
losing CoreML-specific settings like ComputeUnits, MinimumDeploymentTarget,
SpecVersion, InputFeatures, OutputFeatures, and FlexibleInputShapes.
This fix:
- Stores original CoreMLConfiguration in PlatformSpecificOptions during ExportToCoreML
- Retrieves preserved configuration in ConvertOnnxToCoreML
- Falls back to creating default config for backward compatibility
Addresses PR #424 review comment: exporter drops CoreML-specific configuration
* fix: add explicit null guard for directory creation
Added production-ready null handling for Path.GetDirectoryName edge cases:
- Explicit null check before directory operations
- Changed IsNullOrEmpty to IsNullOrWhiteSpace for better validation
- Added clarifying comments about edge cases (root paths, relative filenames)
- Documented fallback behavior when directory is null/empty
Addresses PR #424 review comment: null directory edge case handling
* fix: use constraint-free hash computation in modelcache
Replaced Marshal.SizeOf/Buffer.BlockCopy hashing with GetHashCode-based approach:
- Removed requirement for T : unmanaged constraint
- Uses unchecked hash combining with prime multipliers (17, 31)
- Samples large arrays (max 100 elements) for performance
- Includes array length and last element for better distribution
- Proper null handling for reference types
This allows ModelCache to work with any numeric type without cascading
constraint requirements through DeploymentRuntime, PredictionModelResult,
and dozens of other classes.
Addresses PR #424 review comment: ModelCache T constraint for hashing semantics
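The combining scheme described above can be sketched as follows (Python for illustration; C#'s `unchecked` overflow is emulated with a 32-bit mask, and Python's `hash()` stands in for `GetHashCode()`, so exact values differ from the C# implementation):

```python
def combine_hash(values, max_samples=100):
    """Sketch of prime-multiplier hash combining with sampling.

    Seeds with 17, multiplies by 31 per element, samples at most
    `max_samples` elements from large arrays, then mixes in the array
    length and last element for better distribution.
    """
    MASK = 0xFFFFFFFF                            # emulate 32-bit unchecked math
    h = 17
    step = max(1, len(values) // max_samples)    # stride-sample big arrays
    for v in values[::step][:max_samples]:
        h = (h * 31 + hash(v)) & MASK
    h = (h * 31 + len(values)) & MASK            # include length
    if values:
        h = (h * 31 + hash(values[-1])) & MASK   # include last element
    return h
```

Note a later commit in this PR replaces this approach with SHA-256 for determinism across runtimes.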
* fix: correct event ordering in telemetrycollector getevents
Fixed incorrect ordering logic where Take(limit) was applied before
OrderByDescending(timestamp), causing arbitrary events to be returned
instead of the most recent ones.
Changed:
- _events.Take(limit).OrderByDescending(e => e.Timestamp)
To:
- _events.OrderByDescending(e => e.Timestamp).Take(limit)
This ensures the method returns the MOST RECENT events as intended,
not random events from the ConcurrentBag.
Added clarifying documentation explaining the fix and return value semantics.
Addresses PR #424 review comment: GetEvents ordering issue
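The bug is easy to reproduce in any language: truncating before sorting returns whichever `limit` elements happened to come first in the unordered collection. A minimal Python illustration of the same LINQ reordering:

```python
from datetime import datetime, timedelta

# Ten events, one per minute. In a ConcurrentBag the enumeration order
# would be arbitrary; a sequential list is enough to show the bug.
events = [{"timestamp": datetime(2024, 1, 1) + timedelta(minutes=i)}
          for i in range(10)]
limit = 3

# Buggy (Take before OrderByDescending): three arbitrary events, sorted.
buggy = sorted(events[:limit], key=lambda e: e["timestamp"], reverse=True)

# Fixed (OrderByDescending before Take): the three MOST RECENT events.
fixed = sorted(events, key=lambda e: e["timestamp"], reverse=True)[:limit]

print([e["timestamp"].minute for e in buggy])  # [2, 1, 0]
print([e["timestamp"].minute for e in fixed])  # [9, 8, 7]
```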
* fix: add comprehensive validation for tensorrt configuration
Added production-ready validation to prevent invalid TensorRT configurations:
1. ForInt8() method validation:
- Throws ArgumentNullException if calibration data path is null/whitespace
- Ensures INT8 configurations always have calibration data
2. New Validate() method checks:
- INT8 enabled requires non-empty CalibrationDataPath
- Calibration data file exists if path is provided
- MaxBatchSize >= 1
- MaxWorkspaceSize >= 0
- BuilderOptimizationLevel in valid range [0-5]
- NumStreams >= 1 when EnableMultiStream is true
This prevents runtime failures from misconfigured TensorRT engines,
especially the critical INT8 without calibration data scenario.
Addresses PR #424 review comment: TensorRTConfiguration calibration data validation
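The validation rules above can be sketched as a single guard function (Python pseudocode of the C# `Validate()`; the dict keys are hypothetical stand-ins for the configuration properties named in the commit):

```python
import os

def validate_tensorrt_config(cfg: dict) -> None:
    """Mirror of the Validate() rules listed above (property names assumed)."""
    if cfg.get("enable_int8"):
        path = cfg.get("calibration_data_path") or ""
        if not path.strip():
            raise ValueError("INT8 requires a non-empty CalibrationDataPath")
    path = cfg.get("calibration_data_path")
    if path and not os.path.exists(path):
        raise FileNotFoundError(f"calibration data not found: {path}")
    if cfg.get("max_batch_size", 1) < 1:
        raise ValueError("MaxBatchSize must be >= 1")
    if cfg.get("max_workspace_size", 0) < 0:
        raise ValueError("MaxWorkspaceSize must be >= 0")
    if not 0 <= cfg.get("builder_optimization_level", 3) <= 5:
        raise ValueError("BuilderOptimizationLevel must be in [0, 5]")
    if cfg.get("enable_multi_stream") and cfg.get("num_streams", 1) < 1:
        raise ValueError("NumStreams must be >= 1 when multi-stream is enabled")
```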
* fix: add bounds checking for inputsize/outputsize casts in coreml proto
Validate InputSize and OutputSize are non-negative before casting to ulong to prevent
negative values from wrapping to large unsigned values in CoreML protobuf serialization.
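The wraparound being prevented: casting a negative signed integer to a 64-bit unsigned type reinterprets the two's-complement bits as a huge positive value. A quick demonstration (Python with `ctypes` emulating the C# `(ulong)` cast; the `to_uint64_checked` helper is an illustrative stand-in for the guard):

```python
import ctypes

def to_uint64_checked(value: int) -> int:
    """Guarded widening cast: reject negatives before converting to ulong."""
    if value < 0:
        raise ValueError(f"size must be non-negative, got {value}")
    return ctypes.c_uint64(value).value

# Without the guard, a negative size silently wraps to near 2^64:
print(ctypes.c_uint64(-1).value)  # 18446744073709551615
```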
* fix: add production-ready onnx parsing with type validation and correct shape extraction
This commit fixes three critical issues in ONNX→CoreML conversion:
1. **Data type validation in ParseTensor**: Now reads and validates the data_type field
(field 5), ensuring only FLOAT tensors are converted. Throws NotSupportedException
for unsupported types (DOUBLE, INT8, etc.) instead of silently corrupting data.
2. **Correct TypeProto parsing**: Fixed ParseTypeProto to properly handle nested ONNX
protobuf structure (TypeProto → tensor_type → shape → dim → dim_value) instead of
incorrectly treating every varint as a dimension. This fixes tensor shape extraction
for model inputs/outputs.
3. **Accurate InnerProduct layer sizing**: Changed from Math.Sqrt approximation (which
assumed square matrices) to using actual tensor shape from ONNX dims. For MatMul/Gemm
layers, correctly extracts [out_dim, in_dim] from weight tensor shape.
Technical changes:
- ParseTensor now returns OnnxTensor with Name, Data, and Shape fields
- Added OnnxTensor class to store tensor metadata alongside float data
- Updated OnnxGraphInfo.Initializers from Dictionary<string, float[]> to Dictionary<string, OnnxTensor>
- Added ParseTensorTypeProto, ParseTensorShapeProto, and ParseDimensionProto helper methods
- ConvertOperatorToLayer uses shape[0] and shape[1] for layer sizing with sqrt fallback
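The shape[0]/shape[1] sizing with sqrt fallback (change 3 above) amounts to the following logic, sketched in Python with a hypothetical helper name:

```python
import math

def inner_product_dims(shape, data_len):
    """Derive (out_dim, in_dim) for an InnerProduct layer.

    Prefers the actual weight-tensor shape from the ONNX dims, as the fix
    does; falls back to the old sqrt approximation, which is only correct
    when the weight matrix happens to be square.
    """
    if shape and len(shape) >= 2:
        return shape[0], shape[1]            # [out_dim, in_dim] from ONNX dims
    side = int(math.isqrt(data_len))         # fallback assumes a square matrix
    return side, side

print(inner_product_dims([128, 784], 100352))  # (128, 784)
print(inner_product_dims(None, 64))            # (8, 8) -- sqrt fallback
```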
* fix: preserve all configuration properties across cloning and deserialization
This ensures deployment behavior, model adaptation capabilities, and training history
are maintained when copying or reloading models.
Updated three methods:
1. WithParameters: Now passes LoRAConfiguration, CrossValidationResult, AgentConfig,
AgentRecommendation, and DeploymentConfiguration to constructor
2. DeepCopy: Same as WithParameters for consistency
3. Deserialize: Now assigns all RAG components (RagRetriever, RagReranker, RagGenerator,
QueryProcessors) and configuration properties (LoRAConfiguration, CrossValidationResult,
AgentConfig, AgentRecommendation, DeploymentConfiguration) from deserialized object
This fixes the issue where deployment/export/runtime settings, LoRA configurations, and
meta-learning properties were lost when calling WithParameters, DeepCopy, or Deserialize.
* fix: correct onnx field numbers and address pr review comments
CRITICAL: Fix ONNX TensorProto field number compliance:
- OnnxProto.cs: Change field 3 → 8 for tensor name per ONNX spec
- OnnxToCoreMLConverter.cs: Fix all TensorProto fields (1=dims, 2=data_type, 8=name, 9=raw_data)
- Previous incorrect field numbers would cause empty tensor names and broken shape inference
Additional fixes:
- CoreMLExporter.cs: Fix QuantizationBits mapping (Int8→8, Float16→16, default→32)
- TensorRTConfiguration.cs: Use ArgumentException instead of ArgumentNullException for whitespace validation
- ModelExporterBase.cs: Remove redundant null check (IsNullOrWhiteSpace handles null)
Addresses PR #486 review comments #1, #2, #4, #5, #6
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
* style: use ternary operator for coreml config assignment
Simplify CoreMLExporter.cs by using ternary conditional operator instead of if/else for CoreMLConfiguration assignment.
Addresses PR #486 review comment #5
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
* fix: replace gethashcode with sha256 for model cache correctness
CRITICAL: Model caching requires cryptographically secure hashing to prevent hash collisions that would cause incorrect predictions.
Previous GetHashCode() approach issues:
- 32-bit output, so ~2^-32 collision probability per input pair, and far higher across a large cache by the birthday bound (unacceptable for ML inference)
- Non-deterministic across .NET runtimes, machines, and process restarts
- Sampled only 100 elements from large arrays (incomplete hashing)
- Could return same cache entry for different inputs (silent data corruption)
SHA256-based approach:
- Collision probability ~2^-256 (cryptographically secure)
- Deterministic and stable across all platforms and runtimes
- Hashes ALL array elements for complete correctness
- Ensures cached results always match the correct input
Performance impact: SHA256 hashing adds microseconds, inference takes milliseconds/seconds - the overhead is negligible compared to model inference time.
This fix prioritizes correctness over premature optimization. For production ML systems, silent data corruption from hash collisions is unacceptable.
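The approach can be sketched as hashing a fixed-width serialization of every element (Python illustration; the `cache_key` helper and little-endian double encoding are assumptions, not the C# implementation):

```python
import hashlib
import struct

def cache_key(values) -> str:
    """Deterministic SHA-256 cache key over ALL elements, no sampling.

    Fixed-width binary serialization makes the digest stable across
    processes, machines, and runtimes, unlike GetHashCode().
    """
    h = hashlib.sha256()
    h.update(struct.pack("<q", len(values)))   # include the length
    for v in values:
        h.update(struct.pack("<d", v))         # every element, fixed width
    return h.hexdigest()

assert cache_key([1.0, 2.0]) == cache_key([1.0, 2.0])  # stable
assert cache_key([1.0, 2.0]) != cache_key([2.0, 1.0])  # order-sensitive
```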
Addresses PR #486 review comment #3
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
---------
Co-authored-by: Claude <noreply@anthropic.com>
1 parent f6e4cb2 commit e7a061a
File tree
5 files changed: +191 −27 lines changed
- src
  - Deployment
    - Export/Onnx
    - Mobile/CoreML
    - TensorRT
  - Models/Results