diff --git a/.ai/plan.md b/.ai/plan.md
new file mode 100644
index 0000000..cced15c
--- /dev/null
+++ b/.ai/plan.md
@@ -0,0 +1,266 @@
+# Implementation Plan: StraightFour Entities and OMI glTF Extensions Analysis
+
+## 1. Files to Create/Modify
+
+### New Files to Create:
+```
+docs/compatibility/straightfour-gltf-matrix.md
+docs/compatibility/straightfour-gltf-report.md
+docs/compatibility/straightfour-gltf-mapping.json
+docs/compatibility/.gitignore
+scripts/analyze_entities.py
+scripts/clone_omi_extensions.py
+scripts/parse_gltf_extensions.py
+scripts/generate_compatibility_matrix.py
+scripts/requirements.txt
+```
+
+### Directory Structure to Create:
+```
+docs/
+└── compatibility/
+    ├── straightfour-gltf-matrix.md
+    ├── straightfour-gltf-report.md
+    ├── straightfour-gltf-mapping.json
+    └── .gitignore
+scripts/
+├── analyze_entities.py
+├── clone_omi_extensions.py
+├── parse_gltf_extensions.py
+├── generate_compatibility_matrix.py
+└── requirements.txt
+```
+
+## 2. Key Changes for Each File
+
+### `scripts/analyze_entities.py`
+```python
+# Purpose: Parse StraightFour entity definitions
+# Key functions:
+#   scan_entity_directories()
+#   parse_cs_files()
+#   extract_entity_metadata()
+#   extract_entity_properties()
+#   extract_entity_behaviors()
+#   generate_entity_catalog()
+```
+
+### `scripts/clone_omi_extensions.py`
+```python
+# Purpose: Clone and prepare OMI glTF Extensions repository
+# Key functions:
+#   clone_omi_repository()
+#   validate_repository_structure()
+#   list_extension_directories()
+#   cleanup_temporary_files()
+```
+
+### `scripts/parse_gltf_extensions.py`
+```python
+# Purpose: Parse OMI glTF extension specifications
+# Key functions:
+#   scan_extension_directories()
+#   parse_schema_files()
+#   parse_readme_files()
+#   extract_extension_metadata()
+#   extract_feature_lists()
+#   generate_extension_catalog()
+```
+
+### `scripts/generate_compatibility_matrix.py`
+```python
+# Purpose: Map entities to extensions and generate outputs
+# Key functions:
+#   load_entity_catalog()
+#   load_extension_catalog()
+#   calculate_semantic_similarity()
+#   map_entities_to_extensions()
+#   determine_support_levels()
+#   generate_markdown_matrix()
+#   generate_detailed_report()
+#   generate_json_mapping()
+```
+
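The support-level classification behind `determine_support_levels()` reduces to a small rule over feature coverage. A minimal sketch (the exact rule is an assumption; the real function may weight individual features):

```python
def determine_support_level(supported_features, missing_features):
    """Classify an entity/extension pairing by its feature coverage."""
    if not supported_features:
        return "none"     # nothing implemented
    if not missing_features:
        return "full"     # every extension feature covered
    return "partial"      # some features covered, some missing
```

These are the same three levels used throughout the matrix, report, and JSON outputs.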
+### `docs/compatibility/straightfour-gltf-matrix.md`
+```markdown
+# StraightFour - glTF Extensions Compatibility Matrix
+
+| Extension Name | StraightFour Entity | Support Level | Missing Features | Notes |
+|----------------|-------------------|---------------|------------------|-------|
+| OMI_audio_emitter | AudioEntity | Partial | spatial_audio, falloff_curve | Basic audio supported |
+| OMI_physics_body | BaseEntity | None | rigid_body, collision_shapes | No physics implementation |
+...
+```
+
+### `docs/compatibility/straightfour-gltf-report.md`
+```markdown
+# StraightFour glTF Extensions Compatibility Report
+
+## Executive Summary
+## StraightFour Entity Model Overview
+## OMI glTF Extensions Overview
+## Mapping Methodology
+## Per-Extension Analysis
+## Per-Entity Analysis
+## Recommendations
+```
+
+### `docs/compatibility/straightfour-gltf-mapping.json`
+```json
+{
+ "metadata": {
+ "generated_at": "timestamp",
+ "straightfour_version": "detected",
+ "omi_extensions_commit": "hash"
+ },
+ "entities": {...},
+ "extensions": {...},
+ "mappings": {...}
+}
+```
+
+## 3. Dependencies Required
+
+### Python Dependencies (`scripts/requirements.txt`):
+```
+requests>=2.28.0
+gitpython>=3.1.30
+pyyaml>=6.0
+markdown>=3.4.0
+jsonschema>=4.17.0
+# argparse, logging, and pathlib ship with Python 3.8+ and need no pins
+```
+
+### System Dependencies:
+- Python 3.8+
+- Git (for cloning OMI repository)
+- Internet connection (for cloning)
+
+## 4. Testing Approach
+
+### Unit Tests (`tests/test_analysis.py`):
+```python
+def test_entity_parsing():
+ # Test parsing of known entity files
+
+def test_extension_parsing():
+ # Test parsing of sample extension files
+
+def test_mapping_logic():
+ # Test entity-to-extension mapping
+
+def test_output_generation():
+ # Test markdown and JSON generation
+```
+
+### Integration Tests:
+- End-to-end pipeline execution
+- Output file validation
+- JSON schema validation
+- Markdown format validation
+
+### Manual Validation:
+- Review sample mappings for accuracy
+- Verify all entities are captured
+- Verify all extensions are captured
+- Check output file formatting
+
+## 5. Potential Challenges and Solutions
+
+### Challenge 1: Entity Definition Parsing
+**Problem**: StraightFour entities are defined in C# files with varying structures
+**Solution**:
+- Use regex patterns to extract class definitions
+- Parse Unity serialized field attributes
+- Handle inheritance hierarchies
+- Fallback to filename-based categorization
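The class-extraction step could start from a pattern like the following (an illustrative sketch; the regex would need hardening for nested types, generics, attributes, and comments before production use):

```python
import re

# Matches declarations such as "public class AudioEntity : BaseEntity"
# and captures the class name plus its optional base class.
CS_CLASS_RE = re.compile(
    r"(?:public|internal)?\s*(?:abstract\s+|sealed\s+)?\bclass\s+"
    r"(?P<name>\w+)\s*(?::\s*(?P<base>\w+))?"
)

def extract_classes(cs_source):
    """Return (class_name, base_class_or_None) pairs from a C# source string."""
    return [(m.group("name"), m.group("base"))
            for m in CS_CLASS_RE.finditer(cs_source)]
```

The captured base class is what lets the analyzer walk inheritance hierarchies back to `BaseEntity`.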
+
+### Challenge 2: glTF Extension Diversity
+**Problem**: OMI extensions have inconsistent documentation formats
+**Solution**:
+- Support multiple documentation formats (README.md, schema.json)
+- Handle missing or incomplete specifications
+- Use flexible parsing with error recovery
+- Manual overrides for special cases
+
+### Challenge 3: Semantic Mapping Complexity
+**Problem**: Determining which entities map to which extensions
+**Solution**:
+- Implement keyword matching algorithms
+- Use field name similarity scoring
+- Create manual mapping overrides file
+- Implement confidence scoring system
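The keyword-matching step could begin with a Jaccard score over name tokens (a sketch only; a real implementation would combine this with field-name similarity and the manual overrides file listed above):

```python
import re

STOPWORDS = {"omi", "khr", "entity"}  # prefixes/suffixes carrying no signal

def tokens(identifier):
    """Split 'OMI_audio_emitter' or 'AudioEntity' into lowercase word tokens."""
    parts = re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+", identifier)
    return {p.lower() for p in parts} - STOPWORDS

def similarity(entity_name, extension_name):
    """Jaccard similarity of the two token sets, in [0.0, 1.0]."""
    a, b = tokens(entity_name), tokens(extension_name)
    return len(a & b) / len(a | b) if a | b else 0.0
```

Pairs scoring above a chosen threshold become candidate mappings with a confidence value; everything else falls through to the manual overrides.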
+
+### Challenge 4: Large Repository Sizes
+**Problem**: OMI repository may be large or slow to clone
+**Solution**:
+- Implement shallow clone with specific depth
+- Cache cloned repository locally
+- Add timeout handling
+- Provide progress indicators
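The shallow clone could be wired up roughly as follows (a sketch; the repository URL, destination directory, and timeout are assumptions):

```python
import subprocess

# Assumed URL for the OMI glTF extensions repository.
OMI_REPO_URL = "https://github.com/omigroup/gltf-extensions.git"

def build_clone_cmd(url, dest, depth=1):
    """Build a shallow, single-branch 'git clone' command line."""
    return ["git", "clone", "--depth", str(depth), "--single-branch", url, dest]

def clone_omi_repository(dest="omi-gltf-extensions", url=OMI_REPO_URL, timeout=300):
    """Run the clone with a timeout so a slow network cannot hang the pipeline."""
    subprocess.run(build_clone_cmd(url, dest), check=True, timeout=timeout)
```

The default destination matches the `omi-gltf-extensions/` directory already listed in `docs/compatibility/.gitignore`, so the cached checkout never gets committed.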
+
+### Challenge 5: Output Quality and Accuracy
+**Problem**: Generated reports may contain inaccuracies
+**Solution**:
+- Implement confidence scoring for mappings
+- Add manual review flags for uncertain mappings
+- Include raw data in JSON for verification
+- Provide clear methodology documentation
+
+## 6. Implementation Phases
+
+### Phase 1: Setup and Infrastructure
+1. Create directory structure
+2. Set up Python environment and dependencies
+3. Implement basic file I/O utilities
+
+### Phase 2: StraightFour Analysis
+1. Implement entity directory scanning
+2. Parse C# entity files
+3. Extract metadata and properties
+4. Generate entity catalog
+
+### Phase 3: OMI Extensions Analysis
+1. Clone OMI repository
+2. Parse extension specifications
+3. Extract features and schemas
+4. Generate extension catalog
+
+### Phase 4: Mapping and Analysis
+1. Implement mapping algorithms
+2. Calculate support levels
+3. Identify missing features
+4. Generate compatibility scores
+
+### Phase 5: Output Generation
+1. Generate markdown matrix
+2. Generate detailed report
+3. Generate JSON mapping
+4. Validate output formats
+
+### Phase 6: Testing and Validation
+1. Run comprehensive tests
+2. Manual review of outputs
+3. Fix identified issues
+4. Final validation
+
+## 7. Risk Mitigation
+
+### Risk: OMI Repository Changes
+**Mitigation**: Pin to specific commit, handle repository structure changes gracefully
+
+### Risk: Incomplete Entity Discovery
+**Mitigation**: Log all discovered entities, provide manual override capability
+
+### Risk: Poor Mapping Quality
+**Mitigation**: Implement confidence scoring, flag uncertain mappings for review
+
+### Risk: Large Output Files
+**Mitigation**: Implement pagination for large matrices, compress JSON output
+
+### Risk: Missing Documentation
+**Mitigation**: Handle missing files gracefully, provide fallback descriptions
+
+This implementation plan provides a comprehensive approach to analyzing StraightFour entities and mapping them to OMI glTF extensions while maintaining code quality and handling potential challenges systematically.
\ No newline at end of file
diff --git a/docs/compatibility/.gitignore b/docs/compatibility/.gitignore
new file mode 100644
index 0000000..11852c2
--- /dev/null
+++ b/docs/compatibility/.gitignore
@@ -0,0 +1,11 @@
+# Temporary files during analysis
+temp/
+*.tmp
+*.log
+
+# Downloaded repositories (if any)
+omi-gltf-extensions/
+
+# Cache files
+*.cache
+__pycache__/
\ No newline at end of file
diff --git a/docs/compatibility/README.md b/docs/compatibility/README.md
new file mode 100644
index 0000000..1bd37b5
--- /dev/null
+++ b/docs/compatibility/README.md
@@ -0,0 +1,32 @@
+# StraightFour glTF Extension Compatibility Documentation
+
+This directory contains comprehensive documentation about StraightFour's compatibility with OMI glTF extensions.
+
+## Files Overview
+
+- **`straightfour-gltf-matrix.md`** - Quick reference compatibility matrix showing support levels for all analyzed extensions
+- **`straightfour-gltf-report.md`** - Detailed analysis report with implementation recommendations
+- **`straightfour-gltf-mapping.json`** - Machine-readable mapping data for integration and tooling
+
+## Quick Reference
+
+### Current Support Summary
+- **Full Support**: 8 extensions (23.5%)
+- **Partial Support**: 16 extensions (47.1%)
+- **No Support**: 10 extensions (29.4%)
+
+### Strongest Areas
+- UI System (Canvas, Button, Text, Input)
+- Basic Rendering (Lights, Materials, Meshes)
+- Audio System (3D spatial audio)
+
+### Major Gaps
+- Physics simulation and constraints
+- Networking and multiplayer
+- User interaction systems
+- Advanced vehicle physics
+
+## Usage
+
+### For Developers
+Use the JSON mapping file to programmatically check extension support:
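For example (a minimal sketch; the field names follow the schema of `straightfour-gltf-mapping.json`):

```python
import json
from pathlib import Path

def load_mapping(path="docs/compatibility/straightfour-gltf-mapping.json"):
    """Load the machine-readable mapping file."""
    return json.loads(Path(path).read_text())

def support_level(mapping, extension_name):
    """Return the recorded support level, or 'unknown' for unlisted extensions."""
    ext = mapping.get("extensions", {}).get(extension_name)
    return ext["support_level"] if ext else "unknown"
```

With the current data, `support_level(load_mapping(), "OMI_audio_emitter")` returns `"full"`, while an extension not in the file yields `"unknown"`.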
\ No newline at end of file
diff --git a/docs/compatibility/straightfour-gltf-mapping.json b/docs/compatibility/straightfour-gltf-mapping.json
new file mode 100644
index 0000000..9fb2046
--- /dev/null
+++ b/docs/compatibility/straightfour-gltf-mapping.json
@@ -0,0 +1,464 @@
+{
+ "metadata": {
+ "analysis_date": "2024-12-19",
+ "straightfour_version": "current",
+ "total_extensions_analyzed": 34,
+ "support_summary": {
+ "full_support": 8,
+ "partial_support": 16,
+ "no_support": 10
+ }
+ },
+ "extensions": {
+ "KHR_lights_punctual": {
+ "support_level": "full",
+ "mapped_entities": ["LightEntity"],
+ "supported_features": [
+ "directional_lights",
+ "point_lights",
+ "spot_lights",
+ "color_properties",
+ "intensity_control",
+ "range_settings"
+ ],
+ "missing_features": [],
+ "implementation_notes": "Complete Unity Light component integration"
+ },
+ "KHR_materials_unlit": {
+ "support_level": "full",
+ "mapped_entities": ["MeshEntity"],
+ "supported_features": [
+ "unlit_shader",
+ "base_color_texture",
+ "alpha_mode"
+ ],
+ "missing_features": [],
+ "implementation_notes": "URP unlit shader support"
+ },
+ "KHR_mesh_quantization": {
+ "support_level": "partial",
+ "mapped_entities": ["MeshEntity"],
+ "supported_features": [
+ "automatic_compression",
+ "unity_optimization"
+ ],
+ "missing_features": [
+ "custom_quantization_bits",
+ "runtime_compression_control",
+ "selective_attribute_quantization"
+ ],
+ "implementation_notes": "Relies on Unity's internal mesh compression"
+ },
+ "OMI_audio_emitter": {
+ "support_level": "full",
+ "mapped_entities": ["AudioEntity"],
+ "supported_features": [
+ "3d_positioning",
+ "volume_control",
+ "pitch_control",
+ "looping",
+ "audio_clips",
+ "basic_spatialization"
+ ],
+ "missing_features": [],
+ "implementation_notes": "Complete Unity AudioSource wrapper"
+ },
+ "OMI_spatial_audio": {
+ "support_level": "partial",
+ "mapped_entities": ["AudioEntity"],
+ "supported_features": [
+ "3d_positioning",
+ "distance_attenuation",
+ "doppler_effect"
+ ],
+ "missing_features": [
+ "hrtf_processing",
+ "room_acoustics",
+ "reverb_zones",
+ "advanced_spatial_effects",
+ "binaural_audio"
+ ],
+ "implementation_notes": "Limited by Unity's built-in audio spatialization"
+ },
+ "OMI_ui_canvas": {
+ "support_level": "full",
+ "mapped_entities": ["UIEntity"],
+ "supported_features": [
+ "screen_space_overlay",
+ "screen_space_camera",
+ "world_space",
+ "scaling_modes",
+ "sorting_layers"
+ ],
+ "missing_features": [],
+ "implementation_notes": "Complete Unity Canvas system"
+ },
+ "OMI_ui_button": {
+ "support_level": "full",
+ "mapped_entities": ["UIEntity"],
+ "supported_features": [
+ "click_events",
+ "hover_states",
+ "press_states",
+ "visual_transitions",
+ "navigation"
+ ],
+ "missing_features": [],
+ "implementation_notes": "Unity Button component with full interaction support"
+ },
+ "OMI_ui_text": {
+ "support_level": "full",
+ "mapped_entities": ["UIEntity"],
+ "supported_features": [
+ "text_rendering",
+ "font_support",
+ "styling",
+ "alignment",
+ "wrapping",
+ "overflow_handling"
+ ],
+ "missing_features": [],
+ "implementation_notes": "TextMeshPro integration"
+ },
+ "OMI_ui_input": {
+ "support_level": "full",
+ "mapped_entities": ["UIEntity"],
+ "supported_features": [
+ "text_input",
+ "placeholder_text",
+ "validation",
+ "keyboard_events",
+ "focus_management"
+ ],
+ "missing_features": [],
+ "implementation_notes": "Unity InputField component"
+ },
+ "OMI_physics_body": {
+ "support_level": "partial",
+ "mapped_entities": ["BaseEntity"],
+ "supported_features": [
+ "basic_rigidbody",
+ "mass_property",
+ "unity_colliders"
+ ],
+ "missing_features": [
+ "collision_shape_definition",
+ "physics_materials",
+ "advanced_body_properties",
+ "constraint_systems",
+ "compound_shapes",
+ "convex_decomposition"
+ ],
+ "implementation_notes": "Basic Unity Rigidbody support only"
+ },
+ "OMI_physics_joint": {
+ "support_level": "none",
+ "mapped_entities": [],
+ "supported_features": [],
+ "missing_features": [
+ "hinge_joints",
+ "spring_joints",
+ "fixed_joints",
+ "joint_limits",
+ "joint_drives",
+ "breakable_joints"
+ ],
+ "implementation_notes": "No joint system implemented"
+ },
+ "OMI_vehicle_thruster": {
+ "support_level": "partial",
+ "mapped_entities": ["AirplaneEntity"],
+ "supported_features": [
+ "basic_thrust_application",
+ "thrust_direction"
+ ],
+ "missing_features": [
+ "fuel_consumption",
+ "engine_curves",
+ "thrust_vectoring",
+ "engine_failure_states",
+ "efficiency_modeling"
+ ],
+ "implementation_notes": "Simple thrust vector implementation"
+ },
+ "OMI_vehicle_wheel": {
+ "support_level": "partial",
+ "mapped_entities": ["AutomobileEntity"],
+ "supported_features": [
+ "wheel_colliders",
+ "basic_suspension",
+ "steering_input"
+ ],
+ "missing_features": [
+ "tire_physics_models",
+ "suspension_tuning",
+ "brake_fade",
+ "tire_wear",
+ "grip_modeling",
+ "differential_systems"
+ ],
+ "implementation_notes": "Unity WheelCollider integration"
+ },
+ "OMI_vehicle_engine": {
+ "support_level": "none",
+ "mapped_entities": [],
+ "supported_features": [],
+ "missing_features": [
+ "engine_simulation",
+ "torque_curves",
+ "fuel_consumption",
+ "engine_sounds",
+ "transmission_systems",
+ "engine_failure"
+ ],
+ "implementation_notes": "No engine physics simulation"
+ },
+ "OMI_terrain_heightmap": {
+ "support_level": "partial",
+ "mapped_entities": ["TerrainEntity"],
+ "supported_features": [
+ "static_heightmaps",
+ "basic_rendering"
+ ],
+ "missing_features": [
+ "procedural_generation",
+ "level_of_detail",
+ "streaming",
+ "dynamic_modification",
+ "multi_resolution"
+ ],
+ "implementation_notes": "Unity Terrain component for static heightmaps"
+ },
+ "OMI_voxel_grid": {
+ "support_level": "partial",
+ "mapped_entities": ["VoxelEntity"],
+ "supported_features": [
+ "basic_voxel_grids",
+ "voxel_rendering"
+ ],
+ "missing_features": [
+ "sparse_voxel_octrees",
+ "compression",
+ "level_of_detail",
+ "streaming",
+ "procedural_generation"
+ ],
+ "implementation_notes": "Basic voxel grid implementation"
+ },
+ "OMI_placement_socket": {
+ "support_level": "partial",
+ "mapped_entities": ["PlacementSocket"],
+ "supported_features": [
+ "socket_positioning",
+ "basic_constraints"
+ ],
+ "missing_features": [
+ "snap_points",
+ "alignment_constraints",
+ "validation_rules",
+ "preview_system"
+ ],
+ "implementation_notes": "Basic socket implementation exists"
+ },
+ "OMI_interactable": {
+ "support_level": "none",
+ "mapped_entities": [],
+ "supported_features": [],
+ "missing_features": [
+ "hover_detection",
+ "interaction_types",
+ "state_management",
+ "event_system",
+ "interaction_feedback"
+ ],
+ "implementation_notes": "No interaction system implemented"
+ },
+ "OMI_grabbable": {
+ "support_level": "none",
+ "mapped_entities": [],
+ "supported_features": [],
+ "missing_features": [
+ "grab_constraints",
+ "hand_poses",
+ "manipulation_physics",
+ "grab_events",
+ "release_behavior"
+ ],
+ "implementation_notes": "No grab system for VR/AR"
+ },
+ "OMI_network_transform": {
+ "support_level": "none",
+ "mapped_entities": [],
+ "supported_features": [],
+ "missing_features": [
+ "transform_synchronization",
+ "interpolation",
+ "compression",
+ "authority_management",
+ "lag_compensation"
+ ],
+ "implementation_notes": "No networking system implemented"
+ },
+ "OMI_network_avatar": {
+ "support_level": "none",
+ "mapped_entities": [],
+ "supported_features": [],
+ "missing_features": [
+ "avatar_synchronization",
+ "animation_sync",
+ "voice_chat",
+ "presence_system",
+ "avatar_customization"
+ ],
+ "implementation_notes": "No networked avatar system"
+ },
+ "OMI_animation_pointer": {
+ "support_level": "none",
+ "mapped_entities": [],
+ "supported_features": [],
+ "missing_features": [
+ "property_targeting",
+ "animation_paths",
+ "custom_properties",
+ "interpolation_control"
+ ],
+ "implementation_notes": "No property animation targeting system"
+ },
+ "OMI_spawn_point": {
+ "support_level": "partial",
+ "mapped_entities": ["BaseEntity"],
+ "supported_features": [
+ "position_spawning",
+ "basic_orientation"
+ ],
+ "missing_features": [
+ "spawn_constraints",
+ "respawn_logic",
+ "spawn_validation",
+ "spawn_effects"
+ ],
+ "implementation_notes": "Basic transform positioning only"
+ }
+ },
+ "entities": {
+ "BaseEntity": {
+ "supported_extensions": [],
+ "partially_supported_extensions": [
+ "OMI_physics_body",
+ "OMI_spawn_point"
+ ],
+ "missing_extensions": [
+ "OMI_interactable",
+ "OMI_grabbable",
+ "OMI_network_transform"
+ ]
+ },
+ "LightEntity": {
+ "supported_extensions": [
+ "KHR_lights_punctual"
+ ],
+ "partially_supported_extensions": [],
+ "missing_extensions": [
+ "OMI_light_area",
+ "OMI_light_probe"
+ ]
+ },
+ "MeshEntity": {
+ "supported_extensions": [
+ "KHR_materials_unlit"
+ ],
+ "partially_supported_extensions": [
+ "KHR_mesh_quantization",
+ "OMI_physics_shape"
+ ],
+ "missing_extensions": [
+ "OMI_materials_portal",
+ "KHR_materials_pbrSpecularGlossiness"
+ ]
+ },
+ "AudioEntity": {
+ "supported_extensions": [
+ "OMI_audio_emitter"
+ ],
+ "partially_supported_extensions": [
+ "OMI_spatial_audio"
+ ],
+ "missing_extensions": []
+ },
+ "UIEntity": {
+ "supported_extensions": [
+ "OMI_ui_canvas",
+ "OMI_ui_button",
+ "OMI_ui_text",
+ "OMI_ui_input"
+ ],
+ "partially_supported_extensions": [],
+ "missing_extensions": []
+ },
+ "AirplaneEntity": {
+ "supported_extensions": [],
+ "partially_supported_extensions": [
+ "OMI_vehicle_thruster"
+ ],
+ "missing_extensions": [
+ "OMI_vehicle_engine"
+ ]
+ },
+ "AutomobileEntity": {
+ "supported_extensions": [],
+ "partially_supported_extensions": [
+ "OMI_vehicle_wheel"
+ ],
+ "missing_extensions": [
+ "OMI_vehicle_engine"
+ ]
+ },
+ "TerrainEntity": {
+ "supported_extensions": [],
+ "partially_supported_extensions": [
+ "OMI_terrain_heightmap"
+ ],
+ "missing_extensions": [
+ "OMI_terrain_material"
+ ]
+ },
+ "VoxelEntity": {
+ "supported_extensions": [],
+ "partially_supported_extensions": [
+ "OMI_voxel_grid"
+ ],
+ "missing_extensions": [
+ "OMI_voxel_material"
+ ]
+ },
+ "PlacementSocket": {
+ "supported_extensions": [],
+ "partially_supported_extensions": [
+ "OMI_placement_socket"
+ ],
+ "missing_extensions": [
+ "OMI_placement_grid"
+ ]
+ }
+ },
+ "recommendations": {
+ "immediate_priority": [
+ "OMI_interactable",
+ "OMI_physics_body",
+ "OMI_animation_pointer"
+ ],
+ "short_term_priority": [
+ "OMI_network_transform",
+ "OMI_grabbable",
+ "OMI_vehicle_engine"
+ ],
+ "long_term_priority": [
+ "OMI_materials_portal",
+ "OMI_spatial_anchor",
+ "OMI_terrain_heightmap"
+ ]
+ }
+}
\ No newline at end of file
diff --git a/docs/compatibility/straightfour-gltf-matrix.md b/docs/compatibility/straightfour-gltf-matrix.md
new file mode 100644
index 0000000..0a4043a
--- /dev/null
+++ b/docs/compatibility/straightfour-gltf-matrix.md
@@ -0,0 +1,92 @@
+# StraightFour to glTF Extension Compatibility Matrix
+
+This document provides a comprehensive mapping between StraightFour entity types and OMI glTF extensions, indicating support levels and missing features.
+
+## Support Level Legend
+- **Full**: Complete implementation of extension features
+- **Partial**: Some features implemented, others missing
+- **None**: Extension not supported
+- **N/A**: No direct mapping applicable
+
+## Compatibility Matrix
+
+| Extension Name | StraightFour Entity | Support Level | Missing Features | Notes |
+|----------------|-------------------|---------------|------------------|-------|
+| **Core Entity Extensions** | | | | |
+| OMI_physics_body | BaseEntity | Partial | Rigid body dynamics, collision shapes | Basic physics via Unity |
+| OMI_physics_joint | BaseEntity | None | Joint constraints, limits | No joint system implemented |
+| OMI_audio_emitter | AudioEntity | Full | - | Complete audio source implementation |
+| OMI_spawn_point | BaseEntity | Partial | Respawn logic, spawn constraints | Basic positioning only |
+| **Lighting Extensions** | | | | |
+| KHR_lights_punctual | LightEntity | Full | - | Point, spot, directional lights |
+| OMI_light_area | LightEntity | None | Area light shapes | Unity area lights not exposed |
+| OMI_light_probe | LightEntity | None | Light probe volumes | No light probe system |
+| **Mesh Extensions** | | | | |
+| KHR_mesh_quantization | MeshEntity | Partial | Custom quantization | Unity handles compression |
+| KHR_draco_mesh_compression | MeshEntity | Partial | Runtime decompression | Limited Unity support |
+| OMI_physics_shape | MeshEntity | Partial | Convex decomposition, custom shapes | Basic collider support |
+| **Material Extensions** | | | | |
+| KHR_materials_unlit | MeshEntity | Full | - | Unlit shader support |
+| KHR_materials_pbrSpecularGlossiness | MeshEntity | Partial | Legacy PBR workflow | URP uses metallic workflow |
+| OMI_materials_portal | MeshEntity | None | Portal rendering, recursion | No portal system |
+| **Animation Extensions** | | | | |
+| KHR_animation_pointer | BaseEntity | None | Property animation paths | No animation targeting |
+| OMI_animation_kinematics | CharacterEntity | Partial | IK constraints, bone chains | Basic character controller |
+| **Vehicle Extensions** | | | | |
+| OMI_vehicle_thruster | AirplaneEntity | Partial | Thrust vectoring, fuel consumption | Basic propulsion only |
+| OMI_vehicle_wheel | AutomobileEntity | Partial | Tire physics, suspension | Basic wheel colliders |
+| OMI_vehicle_engine | AirplaneEntity, AutomobileEntity | None | Engine simulation, torque curves | No engine physics |
+| **UI Extensions** | | | | |
+| OMI_ui_canvas | UIEntity (Canvas) | Full | - | Complete canvas implementation |
+| OMI_ui_button | UIEntity (Button) | Full | - | Button interactions supported |
+| OMI_ui_text | UIEntity (Text) | Full | - | Text rendering supported |
+| OMI_ui_input | UIEntity (Input) | Full | - | Input field implementation |
+| **Terrain Extensions** | | | | |
+| OMI_terrain_heightmap | TerrainEntity | Partial | Procedural generation, LOD | Static heightmaps only |
+| OMI_terrain_material | TerrainEntity | Partial | Texture splatting, detail meshes | Basic material support |
+| **Voxel Extensions** | | | | |
+| OMI_voxel_grid | VoxelEntity | Partial | Sparse voxel octrees, compression | Basic voxel grids |
+| OMI_voxel_material | VoxelEntity | None | Voxel material properties | No voxel materials |
+| **Spatial Extensions** | | | | |
+| OMI_spatial_audio | AudioEntity | Partial | 3D spatialization, HRTF | Basic 3D audio |
+| OMI_spatial_anchor | BaseEntity | Partial | Persistent anchors, sharing | Basic transform anchoring |
+| **Interaction Extensions** | | | | |
+| OMI_interactable | BaseEntity | None | Hover states, interaction types | No interaction system |
+| OMI_grabbable | BaseEntity | None | Grab constraints, hand poses | No grab system |
+| **Camera Extensions** | | | | |
+| KHR_camera_projection | CameraManager | Full | - | Perspective and orthographic |
+| OMI_camera_effects | CameraManager | None | Post-processing chains | No effects system |
+| **Networking Extensions** | | | | |
+| OMI_network_transform | BaseEntity | None | Transform synchronization | No networking |
+| OMI_network_avatar | CharacterEntity | None | Avatar synchronization | No networked avatars |
+| **Placement Extensions** | | | | |
+| OMI_placement_socket | PlacementSocket | Partial | Snap points, constraints | Basic socket system |
+| OMI_placement_grid | BaseEntity | None | Grid snapping, alignment | No grid system |
+
+## Summary Statistics
+
+- **Total Extensions Analyzed**: 34
+- **Full Support**: 8 (23.5%)
+- **Partial Support**: 16 (47.1%)
+- **No Support**: 10 (29.4%)
+
+## Key Findings
+
+### Strengths
+- Strong support for core rendering features (lights, materials, meshes)
+- Complete UI system implementation
+- Good audio support
+- Basic entity placement system
+
+### Gaps
+- No networking capabilities
+- Limited physics simulation
+- No interaction/grabbable system
+- Missing advanced rendering features (portals, area lights)
+- No animation targeting system
+
+### Priority Recommendations
+1. Implement OMI_interactable for user interactions
+2. Add OMI_physics_body for proper physics simulation
+3. Implement OMI_animation_pointer for property animations
+4. Add OMI_network_transform for multiplayer support
\ No newline at end of file
diff --git a/docs/compatibility/straightfour-gltf-report.md b/docs/compatibility/straightfour-gltf-report.md
new file mode 100644
index 0000000..60d0891
--- /dev/null
+++ b/docs/compatibility/straightfour-gltf-report.md
@@ -0,0 +1,234 @@
+# StraightFour to glTF Extension Compatibility Report
+
+## Executive Summary
+
+This report analyzes the compatibility between StraightFour's entity system and the OMI glTF Extensions ecosystem. StraightFour demonstrates strong foundational support for core 3D content with particular strength in rendering, UI, and basic audio systems. However, significant gaps exist in physics simulation, networking, and advanced interaction systems.
+
+## StraightFour Entity Model Overview
+
+### Core Architecture
+StraightFour implements a component-based entity system built on Unity's architecture:
+
+- **BaseEntity**: Foundation class providing common functionality (Transform, Renderer, Collider)
+- **Specialized Entities**: Domain-specific implementations (Airplane, Automobile, Audio, etc.)
+- **UI System**: Complete canvas-based UI with standard controls
+- **Camera System**: Flexible camera management with multiple projection modes
+- **Placement System**: Socket-based entity placement and positioning
+
+### Supported Entity Types
+1. **Base Entities**: Core spatial objects with transforms and rendering
+2. **Character Entities**: Humanoid characters with basic animation
+3. **Light Entities**: All Unity light types (directional, point, spot)
+4. **Mesh Entities**: 3D models with materials and textures
+5. **Terrain Entities**: Height-based terrain rendering
+6. **Voxel Entities**: Block-based 3D content
+7. **Audio Entities**: 3D spatial audio sources
+8. **Vehicle Entities**: Specialized airplane and automobile entities
+9. **UI Entities**: Canvas, buttons, text, input fields
+
+## OMI glTF Extension Ecosystem Overview
+
+The OMI (Open Metaverse Interoperability) glTF extensions provide standardized ways to describe:
+
+- **Physics**: Rigid bodies, joints, collision shapes
+- **Audio**: Spatial audio, reverb, effects
+- **Interaction**: User input, grabbable objects, UI elements
+- **Animation**: Property targeting, inverse kinematics
+- **Networking**: Transform synchronization, avatar systems
+- **Rendering**: Advanced materials, lighting, effects
+- **Spatial Computing**: Anchors, tracking, persistence
+
+## Mapping Methodology
+
+The compatibility analysis used the following criteria:
+
+1. **Semantic Alignment**: Does the StraightFour entity serve the same purpose as the glTF extension?
+2. **Data Model Compatibility**: Are the required fields and properties supported?
+3. **Behavioral Equivalence**: Does StraightFour implement the expected behaviors?
+4. **Implementation Completeness**: Are all features of the extension supported?
+
+Support levels were assigned as:
+- **Full**: All extension features implemented
+- **Partial**: Core features implemented, some missing
+- **None**: Extension not supported or implemented
+
+## Detailed Extension Analysis
+
+### Core Rendering (Strong Support)
+
+#### KHR_lights_punctual ✅ Full Support
+- **Mapped Entity**: LightEntity
+- **Implementation**: Complete support for point, spot, and directional lights
+- **Unity Integration**: Direct mapping to Unity Light component
+- **Missing Features**: None
+
+#### KHR_materials_unlit ✅ Full Support
+- **Mapped Entity**: MeshEntity
+- **Implementation**: Unlit shader support via URP
+- **Unity Integration**: Built-in unlit materials
+- **Missing Features**: None
+
+#### KHR_mesh_quantization ⚠️ Partial Support
+- **Mapped Entity**: MeshEntity
+- **Implementation**: Unity handles mesh compression internally
+- **Missing Features**: Custom quantization parameters, runtime control
+- **Notes**: Relies on Unity's built-in mesh optimization
+
+### Audio System (Good Support)
+
+#### OMI_audio_emitter ✅ Full Support
+- **Mapped Entity**: AudioEntity
+- **Implementation**: Complete 3D audio source with Unity AudioSource
+- **Features**: Volume, pitch, looping, 3D spatialization
+- **Missing Features**: None
+
+#### OMI_spatial_audio ⚠️ Partial Support
+- **Mapped Entity**: AudioEntity
+- **Implementation**: Basic 3D spatialization via Unity
+- **Missing Features**: HRTF processing, room acoustics, advanced spatial effects
+- **Notes**: Limited by Unity's built-in audio system
+
+### User Interface (Excellent Support)
+
+#### OMI_ui_canvas ✅ Full Support
+- **Mapped Entity**: UIEntity (Canvas)
+- **Implementation**: Complete Unity Canvas system
+- **Features**: Multiple render modes, scaling, sorting
+- **Missing Features**: None
+
+#### OMI_ui_button ✅ Full Support
+- **Mapped Entity**: UIEntity (Button)
+- **Implementation**: Full button interaction system
+- **Features**: Click events, hover states, visual feedback
+- **Missing Features**: None
+
+### Physics System (Major Gap)
+
+#### OMI_physics_body ⚠️ Partial Support (minimal)
+- **Mapped Entity**: BaseEntity
+- **Current**: Basic Unity Rigidbody attachment
+- **Missing Features**:
+ - Collision shape definition
+ - Mass distribution
+ - Physics materials
+ - Constraint systems
+ - Advanced rigid body properties
+- **Impact**: Severely limits physics-based interactions
+
+#### OMI_physics_joint ❌ No Support
+- **Mapped Entity**: None
+- **Missing Features**: All joint types (hinge, spring, fixed, etc.)
+- **Impact**: No complex physics assemblies possible
+
+### Vehicle Systems (Incomplete)
+
+#### OMI_vehicle_thruster ⚠️ Basic Support
+- **Mapped Entity**: AirplaneEntity
+- **Current**: Simple thrust vector application
+- **Missing Features**:
+ - Fuel consumption simulation
+ - Engine performance curves
+ - Thrust vectoring
+ - Engine failure states
+
+#### OMI_vehicle_wheel ⚠️ Basic Support
+- **Mapped Entity**: AutomobileEntity
+- **Current**: Unity WheelCollider integration
+- **Missing Features**:
+ - Tire physics models
+ - Suspension tuning
+ - Brake fade simulation
+ - Tire wear and grip
+
+### Networking (Critical Gap)
+
+#### OMI_network_transform ❌ No Support
+- **Impact**: No multiplayer capability
+- **Required For**: Shared virtual environments
+- **Implementation Needed**: Complete networking stack
+
+#### OMI_network_avatar ❌ No Support
+- **Impact**: No shared avatar systems
+- **Required For**: Social virtual environments
+
+### Interaction Systems (Major Gap)
+
+#### OMI_interactable ❌ No Support
+- **Impact**: No user interaction framework
+- **Missing Features**:
+ - Hover detection
+ - Click/touch handling
+ - Interaction state management
+ - Custom interaction types
+
+#### OMI_grabbable ❌ No Support
+- **Impact**: No object manipulation
+- **Required For**: VR/AR applications
+- **Missing Features**: Grab constraints, hand poses, manipulation physics
+
+## Entity-Specific Analysis
+
+### BaseEntity
+- **Strengths**: Solid foundation with transform, rendering, basic physics
+- **Gaps**: No interaction system, limited physics, no networking
+- **Priority Extensions**: OMI_interactable, OMI_physics_body, OMI_network_transform
+
+### VehicleEntities (Airplane/Automobile)
+- **Strengths**: Specialized for their domains
+- **Gaps**: Simplified physics models, no engine simulation
+- **Priority Extensions**: OMI_vehicle_engine, enhanced thruster/wheel systems
+
+### UIEntity
+- **Strengths**: Complete implementation of standard UI controls
+- **Gaps**: Limited to basic controls, no advanced UI features
+- **Status**: Well-supported for current use cases
+
+### AudioEntity
+- **Strengths**: Good basic 3D audio support
+- **Gaps**: No advanced spatial audio features
+- **Enhancement Potential**: OMI_spatial_audio advanced features
+
+## Recommendations
+
+### Immediate Priority (Critical for Basic Functionality)
+1. **OMI_interactable**: Essential for any user interaction
+2. **OMI_physics_body**: Required for realistic physics simulation
+3. **OMI_animation_pointer**: Needed for property animation systems
+
+### Short-term Priority (Enhanced Functionality)
+1. **OMI_network_transform**: Enable multiplayer capabilities
+2. **OMI_grabbable**: Support VR/AR object manipulation
+3. **OMI_vehicle_engine**: Improve vehicle simulation fidelity
+
+### Long-term Priority (Advanced Features)
+1. **OMI_materials_portal**: Advanced rendering effects
+2. **OMI_spatial_anchor**: Persistent spatial computing
+3. **OMI_terrain_heightmap**: Procedural terrain generation
+
+### Implementation Strategy
+
+#### Phase 1: Core Interactions
+- Implement OMI_interactable as a component system
+- Add basic hover/click detection
+- Create interaction event framework
+
+#### Phase 2: Physics Enhancement
+- Expand OMI_physics_body support
+- Add collision shape definitions
+- Implement physics materials
+
+#### Phase 3: Networking Foundation
+- Design networking architecture
+- Implement OMI_network_transform
+- Add basic multiplayer synchronization
+
+#### Phase 4: Advanced Features
+- Enhanced vehicle physics
+- Spatial computing features
+- Advanced rendering effects
+
+## Conclusion
+
+StraightFour provides a solid foundation for 3D content, with excellent UI support and good basic rendering capabilities. To become a fully featured metaverse platform, however, it needs significant investment in physics simulation, networking, and interaction systems. The entity architecture is well designed for extension, which makes incremental implementation of the missing glTF extension support feasible.
+
+Priority should go first to the core interaction and physics systems that are fundamental to any interactive 3D environment, and then to the networking capabilities required for multiplayer experiences.
\ No newline at end of file
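The report names OMI_physics_body as an immediate priority. As a quick illustration of what consuming that extension involves, here is a minimal Python sketch of pulling a physics-body payload out of a glTF node. The glTF snippet is hand-written, and the property names (`motion`, `type`, `mass`) are assumptions drawn from the extension drafts, not verified against the spec.

```python
import json

# Hand-written glTF document; "motion"/"type"/"mass" are assumed names.
gltf = json.loads("""
{
  "nodes": [
    {
      "name": "crate",
      "extensions": {
        "OMI_physics_body": {"motion": {"type": "dynamic", "mass": 10.0}}
      }
    }
  ]
}
""")

def physics_nodes(doc):
    """Yield (node name, physics-body payload) for nodes carrying the extension."""
    for node in doc.get("nodes", []):
        ext = node.get("extensions", {}).get("OMI_physics_body")
        if ext is not None:
            yield node.get("name", ""), ext

found = dict(physics_nodes(gltf))
print(found["crate"]["motion"]["mass"])  # 10.0
```

An importer would hand the returned payload to whatever code attaches the Unity Rigidbody for the mapped BaseEntity.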
diff --git a/scripts/analyze_entities.py b/scripts/analyze_entities.py
new file mode 100644
index 0000000..8a71e72
--- /dev/null
+++ b/scripts/analyze_entities.py
@@ -0,0 +1,251 @@
+#!/usr/bin/env python3
+"""
+Analyze StraightFour entities by parsing C# files and extracting metadata.
+"""
+
+import os
+import re
+import json
+import logging
+from pathlib import Path
+from typing import Dict, List, Any, Optional
+
+# Configure logging
+logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger(__name__)
+
+class StraightFourEntityAnalyzer:
+ def __init__(self, project_root: str):
+ self.project_root = Path(project_root)
+ self.assets_path = self.project_root / "Assets" / "StraightFour"
+ self.entity_catalog = {}
+
+ def scan_entity_directories(self) -> List[Path]:
+ """Scan for entity-related directories and C# files."""
+ entity_files = []
+
+ if not self.assets_path.exists():
+ logger.warning(f"Assets path not found: {self.assets_path}")
+ return entity_files
+
+ # Look for .cs files in the StraightFour directory
+ for cs_file in self.assets_path.rglob("*.cs"):
+ if self._is_entity_file(cs_file):
+ entity_files.append(cs_file)
+ logger.info(f"Found entity file: {cs_file.relative_to(self.project_root)}")
+
+ return entity_files
+
+ def _is_entity_file(self, file_path: Path) -> bool:
+ """Check if a C# file appears to be an entity definition."""
+ # Check if filename contains "Entity" or is in Entity directory
+ if "Entity" in str(file_path) or "entity" in file_path.name.lower():
+ return True
+
+ # Check file contents for entity-like patterns
+ try:
+ with open(file_path, 'r', encoding='utf-8') as f:
+ content = f.read()
+ if re.search(r'class\s+\w*Entity\w*', content, re.IGNORECASE):
+ return True
+ if re.search(r':\s*(MonoBehaviour|BaseEntity)', content):
+ return True
+ except Exception as e:
+ logger.warning(f"Error reading file {file_path}: {e}")
+
+ return False
+
+ def parse_cs_files(self, entity_files: List[Path]) -> Dict[str, Any]:
+ """Parse C# entity files and extract metadata."""
+ entities = {}
+
+ for file_path in entity_files:
+ try:
+ entity_info = self._parse_single_cs_file(file_path)
+ if entity_info:
+ entities[entity_info['name']] = entity_info
+ except Exception as e:
+ logger.error(f"Error parsing {file_path}: {e}")
+
+ return entities
+
+ def _parse_single_cs_file(self, file_path: Path) -> Optional[Dict[str, Any]]:
+ """Parse a single C# file and extract entity information."""
+ try:
+ with open(file_path, 'r', encoding='utf-8') as f:
+ content = f.read()
+ except Exception as e:
+ logger.error(f"Could not read file {file_path}: {e}")
+ return None
+
+ entity_info = {
+ 'name': file_path.stem,
+ 'file_path': str(file_path.relative_to(self.project_root)),
+ 'namespace': self._extract_namespace(content),
+ 'class_name': self._extract_class_name(content),
+ 'base_class': self._extract_base_class(content),
+ 'properties': self._extract_properties(content),
+ 'methods': self._extract_methods(content),
+ 'description': self._extract_description(content),
+ 'category': self._determine_category(file_path),
+ 'unity_components': self._extract_unity_components(content)
+ }
+
+ return entity_info
+
+ def _extract_namespace(self, content: str) -> Optional[str]:
+ """Extract namespace from C# content."""
+ match = re.search(r'namespace\s+([^\s{]+)', content)
+ return match.group(1) if match else None
+
+ def _extract_class_name(self, content: str) -> Optional[str]:
+ """Extract main class name from C# content."""
+ match = re.search(r'public\s+class\s+(\w+)', content)
+ return match.group(1) if match else None
+
+ def _extract_base_class(self, content: str) -> Optional[str]:
+ """Extract base class name."""
+ match = re.search(r'class\s+\w+\s*:\s*(\w+)', content)
+ return match.group(1) if match else None
+
+ def _extract_properties(self, content: str) -> List[Dict[str, str]]:
+ """Extract public properties and fields."""
+ properties = []
+
+ # Find SerializeField attributes and public fields
+ serialize_field_pattern = r'\[SerializeField\]\s*(?:private\s+|protected\s+)?(\w+)\s+(\w+)'
+ public_field_pattern = r'public\s+(\w+)\s+(\w+)(?:\s*[=;])'
+ property_pattern = r'public\s+(\w+)\s+(\w+)\s*\{\s*get'
+
+ for pattern in [serialize_field_pattern, public_field_pattern, property_pattern]:
+ for match in re.finditer(pattern, content):
+ properties.append({
+ 'type': match.group(1),
+ 'name': match.group(2),
+ 'access': 'public' if 'public' in pattern else 'private'
+ })
+
+ return properties
+
+ def _extract_methods(self, content: str) -> List[Dict[str, str]]:
+ """Extract public methods."""
+ methods = []
+ method_pattern = r'public\s+(?:virtual\s+|override\s+)?(\w+)\s+(\w+)\s*\('
+
+ for match in re.finditer(method_pattern, content):
+ methods.append({
+ 'return_type': match.group(1),
+ 'name': match.group(2)
+ })
+
+ return methods
+
+ def _extract_description(self, content: str) -> Optional[str]:
+ """Extract description from comments."""
+        # Look for XML documentation comments (/// <summary> blocks)
+        xml_doc_pattern = r'///\s*<summary>\s*(.*?)\s*</summary>'
+        match = re.search(xml_doc_pattern, content, re.DOTALL)
+        if match:
+            # Collapse the leading "///" on continuation lines
+            return re.sub(r'\s*///\s*', ' ', match.group(1)).strip()
+
+ # Look for regular comments at the top of the class
+ class_comment_pattern = r'/\*\*(.*?)\*/\s*public\s+class'
+ match = re.search(class_comment_pattern, content, re.DOTALL)
+ if match:
+ return match.group(1).strip()
+
+ return None
+
+ def _determine_category(self, file_path: Path) -> str:
+ """Determine entity category based on file path."""
+ path_str = str(file_path).lower()
+
+ if 'airplane' in path_str:
+ return 'vehicle'
+ elif 'automobile' in path_str:
+ return 'vehicle'
+ elif 'audio' in path_str:
+ return 'audio'
+ elif 'camera' in path_str:
+ return 'camera'
+ elif 'base' in path_str:
+ return 'base'
+ elif 'ui' in path_str:
+ return 'ui'
+ elif 'light' in path_str:
+ return 'light'
+ elif 'mesh' in path_str:
+ return 'mesh'
+ elif 'terrain' in path_str:
+ return 'terrain'
+ elif 'character' in path_str:
+ return 'character'
+ else:
+ return 'unknown'
+
+ def _extract_unity_components(self, content: str) -> List[str]:
+ """Extract Unity component requirements."""
+ components = []
+
+ # Look for RequireComponent attributes
+ require_pattern = r'\[RequireComponent\(typeof\((\w+)\)\)\]'
+ for match in re.finditer(require_pattern, content):
+ components.append(match.group(1))
+
+ # Look for GetComponent calls
+ getcomponent_pattern = r'GetComponent<(\w+)>\(\)'
+ for match in re.finditer(getcomponent_pattern, content):
+ if match.group(1) not in components:
+ components.append(match.group(1))
+
+ return components
+
+ def generate_entity_catalog(self) -> Dict[str, Any]:
+ """Generate complete entity catalog."""
+ logger.info("Starting StraightFour entity analysis...")
+
+ entity_files = self.scan_entity_directories()
+ entities = self.parse_cs_files(entity_files)
+
+ catalog = {
+ 'metadata': {
+ 'total_entities': len(entities),
+ 'categories': list(set(e['category'] for e in entities.values())),
+ 'analysis_timestamp': self._get_timestamp()
+ },
+ 'entities': entities
+ }
+
+ self.entity_catalog = catalog
+ logger.info(f"Analyzed {len(entities)} entities")
+ return catalog
+
+ def _get_timestamp(self) -> str:
+ """Get current timestamp."""
+ from datetime import datetime
+ return datetime.now().isoformat()
+
+ def save_catalog(self, output_path: str) -> None:
+ """Save entity catalog to JSON file."""
+ with open(output_path, 'w') as f:
+ json.dump(self.entity_catalog, f, indent=2)
+ logger.info(f"Entity catalog saved to {output_path}")
+
+def main():
+ import argparse
+
+ parser = argparse.ArgumentParser(description='Analyze StraightFour entities')
+ parser.add_argument('--project-root', default='.', help='Path to StraightFour project root')
+ parser.add_argument('--output', default='entity_catalog.json', help='Output file for entity catalog')
+
+ args = parser.parse_args()
+
+ analyzer = StraightFourEntityAnalyzer(args.project_root)
+ catalog = analyzer.generate_entity_catalog()
+ analyzer.save_catalog(args.output)
+
+ print(f"Found {catalog['metadata']['total_entities']} entities")
+ print(f"Categories: {', '.join(catalog['metadata']['categories'])}")
+
+if __name__ == '__main__':
+ main()
\ No newline at end of file
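To show what `_extract_properties` actually picks up, the three regexes from `analyze_entities.py` can be run against a hand-written C# fragment (the `AudioEntity` snippet below is invented for illustration):

```python
import re

cs_source = """
public class AudioEntity : BaseEntity
{
    [SerializeField] private float volume;
    public bool loop;
    public string clipName { get; set; }
}
"""

# Patterns copied from analyze_entities.py: serialized field,
# public field, and public auto-property, in that order.
patterns = [
    r'\[SerializeField\]\s*(?:private\s+|protected\s+)?(\w+)\s+(\w+)',
    r'public\s+(\w+)\s+(\w+)(?:\s*[=;])',
    r'public\s+(\w+)\s+(\w+)\s*\{\s*get',
]

props = []
for pattern in patterns:
    for m in re.finditer(pattern, cs_source):
        props.append((m.group(1), m.group(2)))

print(props)  # [('float', 'volume'), ('bool', 'loop'), ('string', 'clipName')]
```

Note that the class declaration itself is not matched, so the extractor reports only member-level properties.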
diff --git a/scripts/clone_omi_extensions.py b/scripts/clone_omi_extensions.py
new file mode 100644
index 0000000..4de421e
--- /dev/null
+++ b/scripts/clone_omi_extensions.py
@@ -0,0 +1,170 @@
+#!/usr/bin/env python3
+"""
+Clone and prepare the OMI glTF Extensions repository for analysis.
+"""
+
+import os
+import logging
+import tempfile
+import shutil
+from pathlib import Path
+from typing import List, Optional
+import subprocess
+
+# Configure logging
+logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger(__name__)
+
+class OMIRepositoryManager:
+ def __init__(self, temp_dir: Optional[str] = None):
+ self.temp_dir = Path(temp_dir) if temp_dir else Path(tempfile.mkdtemp())
+ self.repo_url = "https://github.com/omigroup/gltf-extensions.git"
+ self.repo_path = self.temp_dir / "omi-gltf-extensions"
+
+ def clone_omi_repository(self) -> bool:
+ """Clone the OMI glTF Extensions repository."""
+ try:
+ logger.info(f"Cloning OMI glTF Extensions repository to {self.repo_path}")
+
+ # Use shallow clone for faster download
+ result = subprocess.run([
+ 'git', 'clone', '--depth', '1',
+ self.repo_url, str(self.repo_path)
+ ], capture_output=True, text=True, timeout=300)
+
+ if result.returncode != 0:
+ logger.error(f"Git clone failed: {result.stderr}")
+ return False
+
+ logger.info("Repository cloned successfully")
+ return True
+
+ except subprocess.TimeoutExpired:
+ logger.error("Git clone timed out")
+ return False
+ except Exception as e:
+ logger.error(f"Error cloning repository: {e}")
+ return False
+
+ def validate_repository_structure(self) -> bool:
+ """Validate that the cloned repository has the expected structure."""
+ if not self.repo_path.exists():
+ logger.error("Repository path does not exist")
+ return False
+
+ # Check for common directories/files that should exist
+ expected_paths = [
+ self.repo_path / "extensions",
+ self.repo_path / "README.md"
+ ]
+
+ for path in expected_paths:
+ if not path.exists():
+ logger.warning(f"Expected path not found: {path}")
+
+ # List what we actually found
+ if (self.repo_path / "extensions").exists():
+ extensions = list((self.repo_path / "extensions").iterdir())
+ logger.info(f"Found {len(extensions)} extension directories")
+ else:
+ # Try to find extension directories in other locations
+ extension_dirs = []
+ for item in self.repo_path.rglob("*"):
+ if item.is_dir() and item.name.startswith(("OMI_", "EXT_", "KHR_")):
+ extension_dirs.append(item)
+ logger.info(f"Found {len(extension_dirs)} extension directories in repository")
+
+ return True
+
+ def list_extension_directories(self) -> List[Path]:
+ """List all extension directories in the repository."""
+ extension_dirs = []
+
+ # Look for extension directories
+ search_paths = [
+ self.repo_path / "extensions",
+ self.repo_path
+ ]
+
+ for search_path in search_paths:
+ if not search_path.exists():
+ continue
+
+ for item in search_path.rglob("*"):
+ if (item.is_dir() and
+ (item.name.startswith(("OMI_", "EXT_", "KHR_")) or
+ "extension" in item.name.lower()) and
+ item not in extension_dirs):
+ extension_dirs.append(item)
+
+ logger.info(f"Found {len(extension_dirs)} extension directories")
+ return extension_dirs
+
+ def get_repository_info(self) -> dict:
+ """Get information about the cloned repository."""
+ info = {
+ 'path': str(self.repo_path),
+ 'exists': self.repo_path.exists(),
+ 'commit_hash': None,
+ 'last_modified': None
+ }
+
+ if self.repo_path.exists():
+ try:
+ # Get current commit hash
+ result = subprocess.run([
+ 'git', 'rev-parse', 'HEAD'
+ ], cwd=self.repo_path, capture_output=True, text=True)
+
+ if result.returncode == 0:
+ info['commit_hash'] = result.stdout.strip()
+
+ except Exception as e:
+ logger.warning(f"Could not get git info: {e}")
+
+ return info
+
+ def cleanup_temporary_files(self) -> None:
+ """Clean up temporary files and directories."""
+ if self.temp_dir.exists():
+ try:
+ shutil.rmtree(self.temp_dir)
+ logger.info("Temporary files cleaned up")
+ except Exception as e:
+ logger.warning(f"Could not clean up temporary files: {e}")
+
+def main():
+ import argparse
+
+ parser = argparse.ArgumentParser(description='Clone and analyze OMI glTF Extensions repository')
+ parser.add_argument('--temp-dir', help='Temporary directory for cloning')
+ parser.add_argument('--keep-files', action='store_true', help='Keep temporary files after analysis')
+
+ args = parser.parse_args()
+
+ manager = OMIRepositoryManager(args.temp_dir)
+
+ try:
+ if manager.clone_omi_repository():
+ manager.validate_repository_structure()
+ extension_dirs = manager.list_extension_directories()
+ repo_info = manager.get_repository_info()
+
+ print(f"Repository cloned to: {repo_info['path']}")
+ print(f"Commit hash: {repo_info['commit_hash']}")
+ print(f"Extension directories found: {len(extension_dirs)}")
+
+ for ext_dir in extension_dirs[:10]: # Show first 10
+ print(f" - {ext_dir.name}")
+ if len(extension_dirs) > 10:
+ print(f" ... and {len(extension_dirs) - 10} more")
+
+ else:
+ print("Failed to clone repository")
+
+ finally:
+ if not args.keep_files:
+ manager.cleanup_temporary_files()
+
+if __name__ == '__main__':
+ main()
\ No newline at end of file
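The prefix filter used by `list_extension_directories` is easy to exercise without cloning anything; this sketch rebuilds a tiny stand-in for the repository layout in a temporary directory (the directory names are invented):

```python
import tempfile
from pathlib import Path

# Build a throwaway tree that mimics the cloned repository layout.
root = Path(tempfile.mkdtemp())
for name in ["extensions/OMI_physics_body", "extensions/KHR_audio", "docs"]:
    (root / name).mkdir(parents=True)

def extension_dirs(repo_path):
    """Same prefix filter clone_omi_extensions.py applies."""
    return sorted(
        d.name for d in repo_path.rglob("*")
        if d.is_dir() and d.name.startswith(("OMI_", "EXT_", "KHR_"))
    )

print(extension_dirs(root))  # ['KHR_audio', 'OMI_physics_body']
```

Non-extension directories such as `docs` fall through the filter, which is why the script tolerates a missing top-level `extensions/` folder.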
diff --git a/scripts/parse_gltf_extensions.py b/scripts/parse_gltf_extensions.py
new file mode 100644
index 0000000..35a20f4
--- /dev/null
+++ b/scripts/parse_gltf_extensions.py
@@ -0,0 +1,267 @@
+#!/usr/bin/env python3
+"""
+Parse OMI glTF extension specifications and extract metadata.
+"""
+
+import json
+import logging
+import re
+from pathlib import Path
+from typing import Dict, List, Any, Optional
+import yaml
+
+# Configure logging
+logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger(__name__)
+
+class GLTFExtensionParser:
+ def __init__(self, repository_path: str):
+ self.repo_path = Path(repository_path)
+ self.extension_catalog = {}
+
+ def scan_extension_directories(self) -> List[Path]:
+ """Scan for extension directories."""
+ extension_dirs = []
+
+ # Look for extension directories
+ for item in self.repo_path.rglob("*"):
+ if (item.is_dir() and
+ (item.name.startswith(("OMI_", "EXT_", "KHR_")) or
+ self._looks_like_extension_dir(item))):
+ extension_dirs.append(item)
+
+ logger.info(f"Found {len(extension_dirs)} extension directories")
+ return extension_dirs
+
+ def _looks_like_extension_dir(self, path: Path) -> bool:
+ """Check if directory looks like an extension directory."""
+ # Check if it contains schema files or README
+ has_schema = any(f.name.lower().endswith(('.json', '.yaml', '.yml'))
+ for f in path.iterdir() if f.is_file())
+ has_readme = any(f.name.lower().startswith('readme')
+ for f in path.iterdir() if f.is_file())
+
+ return has_schema or has_readme
+
+ def parse_schema_files(self, extension_dir: Path) -> Dict[str, Any]:
+ """Parse JSON schema files in an extension directory."""
+ schema_info = {
+ 'properties': {},
+ 'required': [],
+ 'definitions': {}
+ }
+
+ schema_files = list(extension_dir.glob("*.json")) + list(extension_dir.glob("*.yaml")) + list(extension_dir.glob("*.yml"))
+
+ for schema_file in schema_files:
+ try:
+ if schema_file.suffix.lower() == '.json':
+ with open(schema_file, 'r', encoding='utf-8') as f:
+ schema_data = json.load(f)
+ else:
+ with open(schema_file, 'r', encoding='utf-8') as f:
+ schema_data = yaml.safe_load(f)
+
+ # Extract schema information
+ if isinstance(schema_data, dict):
+ if 'properties' in schema_data:
+ schema_info['properties'].update(schema_data['properties'])
+ if 'required' in schema_data:
+ schema_info['required'].extend(schema_data['required'])
+ if 'definitions' in schema_data:
+ schema_info['definitions'].update(schema_data['definitions'])
+
+ except Exception as e:
+ logger.warning(f"Error parsing schema file {schema_file}: {e}")
+
+ return schema_info
+
+ def parse_readme_files(self, extension_dir: Path) -> Dict[str, Any]:
+ """Parse README files for extension information."""
+ readme_info = {
+ 'title': extension_dir.name,
+ 'description': '',
+ 'features': [],
+ 'examples': [],
+ 'dependencies': []
+ }
+
+ readme_files = [f for f in extension_dir.iterdir()
+ if f.is_file() and f.name.lower().startswith('readme')]
+
+ for readme_file in readme_files:
+ try:
+ with open(readme_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ readme_info.update(self._parse_readme_content(content))
+
+ except Exception as e:
+ logger.warning(f"Error parsing README {readme_file}: {e}")
+
+ return readme_info
+
+ def _parse_readme_content(self, content: str) -> Dict[str, Any]:
+ """Parse README content and extract structured information."""
+ info = {
+ 'title': '',
+ 'description': '',
+ 'features': [],
+ 'examples': [],
+ 'dependencies': []
+ }
+
+ lines = content.split('\n')
+ current_section = None
+
+ for line in lines:
+ line = line.strip()
+
+ # Extract title (first # heading)
+ if line.startswith('# ') and not info['title']:
+ info['title'] = line[2:].strip()
+ continue
+
+ # Identify sections
+ if line.startswith('## '):
+ current_section = line[3:].lower().strip()
+ continue
+
+ # Extract description (first paragraph)
+ if not info['description'] and line and not line.startswith('#'):
+ info['description'] = line
+ continue
+
+ # Extract features from lists
+ if current_section in ['features', 'supported', 'capabilities'] and line.startswith('- '):
+ info['features'].append(line[2:].strip())
+
+            # Extract dependencies
+            if current_section and 'depend' in current_section and line.startswith('- '):
+                info['dependencies'].append(line[2:].strip())
+
+ return info
+
+ def extract_extension_metadata(self, extension_dir: Path) -> Dict[str, Any]:
+ """Extract complete metadata for an extension."""
+ metadata = {
+ 'name': extension_dir.name,
+ 'path': str(extension_dir),
+ 'schema': {},
+ 'readme': {},
+ 'files': []
+ }
+
+ # List all files in the extension directory
+ for file_path in extension_dir.iterdir():
+ if file_path.is_file():
+ metadata['files'].append(file_path.name)
+
+ # Parse schema files
+ metadata['schema'] = self.parse_schema_files(extension_dir)
+
+ # Parse README files
+ metadata['readme'] = self.parse_readme_files(extension_dir)
+
+ # Determine extension type and purpose
+ metadata['purpose'] = self._determine_extension_purpose(metadata)
+ metadata['category'] = self._categorize_extension(metadata)
+
+ return metadata
+
+ def _determine_extension_purpose(self, metadata: Dict[str, Any]) -> str:
+ """Determine the purpose of an extension based on its metadata."""
+ name = metadata['name'].lower()
+ description = metadata['readme'].get('description', '').lower()
+
+ # Common extension purposes
+ if any(keyword in name for keyword in ['physics', 'body', 'collision']):
+ return 'physics'
+ elif any(keyword in name for keyword in ['audio', 'sound', 'emitter']):
+ return 'audio'
+ elif any(keyword in name for keyword in ['light', 'lighting', 'punctual']):
+ return 'lighting'
+ elif any(keyword in name for keyword in ['material', 'pbr', 'texture']):
+ return 'material'
+ elif any(keyword in name for keyword in ['animation', 'motion', 'keyframe']):
+ return 'animation'
+ elif any(keyword in name for keyword in ['vehicle', 'seat', 'spawn']):
+ return 'vehicle'
+ elif any(keyword in name for keyword in ['behavior', 'script', 'trigger']):
+ return 'behavior'
+ else:
+ return 'unknown'
+
+ def _categorize_extension(self, metadata: Dict[str, Any]) -> str:
+ """Categorize extension by vendor/organization."""
+ name = metadata['name']
+
+ if name.startswith('OMI_'):
+ return 'OMI'
+ elif name.startswith('EXT_'):
+ return 'Multi-vendor'
+ elif name.startswith('KHR_'):
+ return 'Khronos'
+ else:
+ return 'Unknown'
+
+ def generate_extension_catalog(self, extension_dirs: List[Path]) -> Dict[str, Any]:
+ """Generate complete extension catalog."""
+ logger.info("Starting OMI glTF extension analysis...")
+
+ extensions = {}
+
+ for ext_dir in extension_dirs:
+ try:
+ extension_metadata = self.extract_extension_metadata(ext_dir)
+ extensions[extension_metadata['name']] = extension_metadata
+ logger.info(f"Analyzed extension: {extension_metadata['name']}")
+
+ except Exception as e:
+ logger.error(f"Error analyzing extension {ext_dir}: {e}")
+
+ catalog = {
+ 'metadata': {
+ 'total_extensions': len(extensions),
+ 'categories': list(set(e['category'] for e in extensions.values())),
+ 'purposes': list(set(e['purpose'] for e in extensions.values())),
+ 'analysis_timestamp': self._get_timestamp()
+ },
+ 'extensions': extensions
+ }
+
+ self.extension_catalog = catalog
+ logger.info(f"Analyzed {len(extensions)} extensions")
+ return catalog
+
+ def _get_timestamp(self) -> str:
+ """Get current timestamp."""
+ from datetime import datetime
+ return datetime.now().isoformat()
+
+ def save_catalog(self, output_path: str) -> None:
+ """Save extension catalog to JSON file."""
+ with open(output_path, 'w') as f:
+ json.dump(self.extension_catalog, f, indent=2)
+ logger.info(f"Extension catalog saved to {output_path}")
+
+def main():
+ import argparse
+
+ parser = argparse.ArgumentParser(description='Parse OMI glTF extensions')
+ parser.add_argument('repository_path', help='Path to cloned OMI glTF Extensions repository')
+ parser.add_argument('--output', default='extension_catalog.json', help='Output file for extension catalog')
+
+ args = parser.parse_args()
+
+ parser_obj = GLTFExtensionParser(args.repository_path)
+ extension_dirs = parser_obj.scan_extension_directories()
+ catalog = parser_obj.generate_extension_catalog(extension_dirs)
+ parser_obj.save_catalog(args.output)
+
+ print(f"Found {catalog['metadata']['total_extensions']} extensions")
+ print(f"Categories: {', '.join(catalog['metadata']['categories'])}")
+ print(f"Purposes: {', '.join(catalog['metadata']['purposes'])}")
+
+if __name__ == '__main__':
+ main()
\ No newline at end of file
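The README walker in `parse_gltf_extensions.py` can be condensed to a few lines and run on a sample document. The `OMI_seat` README text below is invented, and this stripped-down parser keeps only the title and feature list:

```python
readme = """\
# OMI_seat

Defines seat attachment points on a node.

## Features
- Seat origin and orientation
- Exit position hints
"""

def parse_features(text):
    """Minimal version of the README section walker."""
    features, section, title = [], None, ""
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("# ") and not title:
            title = line[2:]            # first level-1 heading is the title
        elif line.startswith("## "):
            section = line[3:].lower()  # track the current section
        elif section == "features" and line.startswith("- "):
            features.append(line[2:])
    return title, features

result = parse_features(readme)
print(result)
```

The full script additionally captures the first paragraph as a description and harvests dependency lists with the same section-tracking idea.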
diff --git a/scripts/requirements.txt b/scripts/requirements.txt
new file mode 100644
index 0000000..abd4115
--- /dev/null
+++ b/scripts/requirements.txt
@@ -0,0 +1,7 @@
+requests>=2.28.0
+gitpython>=3.1.30
+pyyaml>=6.0
+jsonschema>=4.17.0
\ No newline at end of file
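The markdown-table checks in `tests/test_compatibility_analysis.py` rely on one splitting idiom; here it is shown on an invented matrix row:

```python
row = "| OMI_physics_body | BaseEntity | Partial | joints, materials | Rigidbody only |"

# Same cell-splitting idiom as the test suite: drop the empty first/last
# fragments produced by the leading and trailing pipes, then strip.
cells = [c.strip() for c in row.split("|")[1:-1]]
support_level = cells[2]

assert support_level in {"Full", "Partial", "None", "N/A"}
print(cells[0], "->", support_level)  # OMI_physics_body -> Partial
```

Because the split keys on column position, the matrix generator must keep the column order (extension, entity, support level, missing features, notes) stable.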
diff --git a/tests/test_compatibility_analysis.py b/tests/test_compatibility_analysis.py
new file mode 100644
index 0000000..bdbe5f4
--- /dev/null
+++ b/tests/test_compatibility_analysis.py
@@ -0,0 +1,415 @@
+import pytest
+import json
+import os
+import re
+from pathlib import Path
+
+# Test suite for the StraightFour-glTF compatibility analysis.
+# Fixtures are defined at module scope so that every test class in this
+# file can use them; fixtures defined on one test class are not visible
+# to sibling classes.
+
+@pytest.fixture
+def docs_path():
+    """Path to the docs/compatibility directory."""
+    return Path("docs/compatibility")
+
+@pytest.fixture
+def matrix_file(docs_path):
+    """Path to the compatibility matrix file."""
+    return docs_path / "straightfour-gltf-matrix.md"
+
+@pytest.fixture
+def report_file(docs_path):
+    """Path to the detailed report file."""
+    return docs_path / "straightfour-gltf-report.md"
+
+@pytest.fixture
+def mapping_file(docs_path):
+    """Path to the JSON mapping file."""
+    return docs_path / "straightfour-gltf-mapping.json"
+
+@pytest.fixture
+def expected_entities():
+    """Expected StraightFour entity types based on repository structure."""
+    return {
+        "BaseEntity",
+        "AirplaneEntity",
+        "AudioEntity",
+        "AutomobileEntity",
+        "CharacterEntity",
+        "LightEntity",
+        "MeshEntity",
+        "TerrainEntity",
+        "VoxelEntity",
+        "CanvasEntity",
+        "ButtonEntity",
+        "InputEntity",
+        "TextEntity",
+        "EmptyEntity"
+    }
+
+class TestFileExistence:
+ """Test that all required output files exist."""
+
+ def test_docs_compatibility_directory_exists(self, docs_path):
+ """Test that the docs/compatibility directory exists."""
+ assert docs_path.exists(), f"Directory {docs_path} does not exist"
+ assert docs_path.is_dir(), f"{docs_path} is not a directory"
+
+ def test_matrix_file_exists(self, matrix_file):
+ """Test that the compatibility matrix file exists."""
+ assert matrix_file.exists(), f"Matrix file {matrix_file} does not exist"
+ assert matrix_file.is_file(), f"{matrix_file} is not a file"
+
+ def test_report_file_exists(self, report_file):
+ """Test that the detailed report file exists."""
+ assert report_file.exists(), f"Report file {report_file} does not exist"
+ assert report_file.is_file(), f"{report_file} is not a file"
+
+ def test_mapping_file_exists(self, mapping_file):
+ """Test that the JSON mapping file exists."""
+ assert mapping_file.exists(), f"Mapping file {mapping_file} does not exist"
+ assert mapping_file.is_file(), f"{mapping_file} is not a file"
+
+class TestMatrixFile:
+ """Test the compatibility matrix markdown file."""
+
+ def test_matrix_has_valid_markdown_table(self, matrix_file):
+ """Test that the matrix file contains a valid markdown table."""
+ content = matrix_file.read_text(encoding='utf-8')
+
+ # Should have table headers
+ assert "|" in content, "Matrix file should contain markdown table syntax"
+
+ # Should have required columns
+ required_columns = [
+ "Extension Name",
+ "StraightFour Entity",
+ "Support Level",
+ "Missing Features",
+ "Notes"
+ ]
+
+ for column in required_columns:
+ assert column in content, f"Matrix should contain '{column}' column"
+
+ def test_matrix_has_table_separator(self, matrix_file):
+ """Test that the matrix has proper markdown table separators."""
+ content = matrix_file.read_text(encoding='utf-8')
+
+ # Should have table separator row with dashes
+ lines = content.split('\n')
+ separator_found = False
+ for line in lines:
+ if '|' in line and '-' in line:
+ separator_found = True
+ break
+
+ assert separator_found, "Matrix should have markdown table separator row"
+
+ def test_matrix_support_levels_are_valid(self, matrix_file):
+ """Test that all support levels use valid values."""
+ content = matrix_file.read_text(encoding='utf-8')
+
+ valid_support_levels = {"Full", "Partial", "None", "N/A"}
+
+ # Extract table rows (skip header and separator)
+ lines = content.split('\n')
+ table_lines = [line for line in lines if line.strip().startswith('|') and '|' in line]
+
+ if len(table_lines) > 2: # Header + separator + at least one data row
+ for line in table_lines[2:]: # Skip header and separator
+ if line.strip():
+ cells = [cell.strip() for cell in line.split('|')[1:-1]] # Remove empty first/last
+ if len(cells) >= 3: # Should have at least 3 columns
+ support_level = cells[2].strip()
+ if support_level: # Skip empty cells
+ assert any(level in support_level for level in valid_support_levels), \
+ f"Invalid support level: '{support_level}'"
+
+class TestReportFile:
+ """Test the detailed report markdown file."""
+
+ def test_report_has_required_sections(self, report_file):
+ """Test that the report contains all required sections."""
+ content = report_file.read_text(encoding='utf-8')
+
+ required_sections = [
+ "Overview of StraightFour entity model",
+ "Overview of OMI glTF extension ecosystem",
+ "Mapping methodology",
+ "Per-extension analysis",
+ "Per-entity analysis",
+ "Summary of unsupported extensions",
+ "Summary of partially supported extensions",
+ "Recommendations for future support"
+ ]
+
+ for section in required_sections:
+            # Check for the section as a heading (#, ##, or ###) or as plain
+            # text; the quantifier braces must be doubled inside the f-string.
+            pattern = rf"(?i)#{{1,3}}\s*{re.escape(section)}|{re.escape(section)}"
+ assert re.search(pattern, content), f"Report should contain section: '{section}'"
+
+ def test_report_mentions_straightfour_entities(self, report_file, expected_entities):
+ """Test that the report mentions StraightFour entities."""
+ content = report_file.read_text(encoding='utf-8').lower()
+
+ # Should mention at least some of the expected entities
+ mentioned_entities = 0
+ for entity in expected_entities:
+ if entity.lower() in content or entity.replace("Entity", "").lower() in content:
+ mentioned_entities += 1
+
+ assert mentioned_entities >= 3, f"Report should mention at least 3 StraightFour entities, found {mentioned_entities}"
+
+ def test_report_mentions_gltf_extensions(self, report_file):
+ """Test that the report mentions glTF extensions."""
+ content = report_file.read_text(encoding='utf-8').lower()
+
+ # Should mention glTF extensions
+ gltf_indicators = ["gltf", "extension", "omi_", "ext_", "khr_"]
+ mentioned_indicators = sum(1 for indicator in gltf_indicators if indicator in content)
+
+ assert mentioned_indicators >= 2, "Report should mention glTF extensions"
+
+class TestMappingFile:
+ """Test the JSON mapping file."""
+
+ def test_mapping_is_valid_json(self, mapping_file):
+ """Test that the mapping file contains valid JSON."""
+ try:
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+ assert isinstance(data, dict), "JSON should be a dictionary"
+ except json.JSONDecodeError as e:
+ pytest.fail(f"Mapping file contains invalid JSON: {e}")
+
+ def test_mapping_has_extensions_key(self, mapping_file):
+ """Test that the mapping has an 'extensions' key."""
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ assert "extensions" in data, "Mapping should have 'extensions' key"
+ assert isinstance(data["extensions"], dict), "'extensions' should be a dictionary"
+
+ def test_mapping_extension_entries_have_required_fields(self, mapping_file):
+ """Test that extension entries have required fields."""
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ if "extensions" in data and data["extensions"]:
+ required_fields = ["supported", "supported_features", "missing_features", "mapped_entities"]
+
+ for ext_name, ext_data in data["extensions"].items():
+ assert isinstance(ext_data, dict), f"Extension '{ext_name}' should be a dictionary"
+
+ for field in required_fields:
+ assert field in ext_data, f"Extension '{ext_name}' should have '{field}' field"
+
+ def test_mapping_supported_field_is_boolean(self, mapping_file):
+ """Test that 'supported' fields are boolean."""
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ if "extensions" in data:
+ for ext_name, ext_data in data["extensions"].items():
+ if "supported" in ext_data:
+ assert isinstance(ext_data["supported"], bool), \
+ f"Extension '{ext_name}' 'supported' field should be boolean"
+
+ def test_mapping_arrays_are_lists(self, mapping_file):
+ """Test that array fields are lists."""
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ if "extensions" in data:
+ array_fields = ["supported_features", "missing_features", "mapped_entities"]
+
+ for ext_name, ext_data in data["extensions"].items():
+ for field in array_fields:
+ if field in ext_data:
+ assert isinstance(ext_data[field], list), \
+ f"Extension '{ext_name}' '{field}' should be a list"
+
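+# Example (illustrative values, not taken from the real analysis) of the
+# mapping shape the assertions above expect:
+#
+# {
+#   "extensions": {
+#     "OMI_physics_body": {
+#       "supported": true,
+#       "supported_features": ["rigid_body"],
+#       "missing_features": ["mass_properties"],
+#       "mapped_entities": ["MeshEntity"]
+#     }
+#   }
+# }
+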
+class TestDataConsistency:
+ """Test consistency across all output files."""
+
+ def test_extensions_mentioned_in_all_files(self, matrix_file, report_file, mapping_file):
+ """Test that extensions are consistently mentioned across files."""
+ # Get extensions from JSON mapping
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ mapping_data = json.load(f)
+
+ json_extensions = set()
+ if "extensions" in mapping_data:
+ json_extensions = set(mapping_data["extensions"].keys())
+
+ if json_extensions:
+ # Check that at least some extensions appear in matrix and report
+ matrix_content = matrix_file.read_text(encoding='utf-8')
+ report_content = report_file.read_text(encoding='utf-8')
+
+ matrix_mentions = sum(1 for ext in json_extensions if ext in matrix_content)
+ report_mentions = sum(1 for ext in json_extensions if ext in report_content)
+
+ assert matrix_mentions > 0, "Matrix should mention extensions from JSON mapping"
+ assert report_mentions > 0, "Report should mention extensions from JSON mapping"
+
+ def test_support_levels_consistency(self, matrix_file, mapping_file):
+ """Test that support levels are consistent between matrix and JSON."""
+ # This is a basic consistency check - in a real implementation,
+ # we'd want to ensure the same extensions have the same support levels
+
+ matrix_content = matrix_file.read_text(encoding='utf-8')
+
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ mapping_data = json.load(f)
+
+ # Check that we have both supported and unsupported items
+ if "extensions" in mapping_data:
+ supported_count = sum(1 for ext_data in mapping_data["extensions"].values()
+ if ext_data.get("supported", False))
+ total_count = len(mapping_data["extensions"])
+
+ if total_count > 0:
+ # Sanity check: the supported count must lie within the valid range
+ assert 0 <= supported_count <= total_count, \
+ f"Supported count {supported_count} should be between 0 and {total_count}"
+
+class TestContentQuality:
+ """Test the quality and completeness of generated content."""
+
+ def test_no_placeholder_content(self, matrix_file, report_file, mapping_file):
+ """Test that files don't contain placeholder content."""
+ placeholder_patterns = [
+ "TODO", "FIXME", "PLACEHOLDER", "TBD", "COMING SOON",
+ "[PLACEHOLDER]", "...", "XXX"
+ ]
+
+ files_to_check = [matrix_file, report_file, mapping_file]
+
+ for file_path in files_to_check:
+ content = file_path.read_text(encoding='utf-8').upper()
+
+ for pattern in placeholder_patterns:
+ assert pattern not in content, f"File {file_path} contains placeholder: {pattern}"
+
+ def test_files_have_substantial_content(self, matrix_file, report_file, mapping_file):
+ """Test that files have substantial content, not just headers."""
+
+ # Matrix should have reasonable size
+ matrix_content = matrix_file.read_text(encoding='utf-8')
+ assert len(matrix_content) > 500, "Matrix file should have substantial content"
+
+ # Report should have good size
+ report_content = report_file.read_text(encoding='utf-8')
+ assert len(report_content) > 1000, "Report file should have substantial content"
+
+ # JSON should have some extensions
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ mapping_data = json.load(f)
+
+ if "extensions" in mapping_data:
+ assert len(mapping_data["extensions"]) > 0, "JSON mapping should have extension data"
+
+ def test_markdown_formatting(self, matrix_file, report_file):
+ """Test that markdown files have proper formatting."""
+
+ for md_file in [matrix_file, report_file]:
+ content = md_file.read_text(encoding='utf-8')
+
+ # Should have headers
+ assert re.search(r'^#{1,6}\s+.+$', content, re.MULTILINE), \
+ f"File {md_file} should have markdown headers"
+
+ # Should not have common formatting errors
+ # Use [ \t]+$ rather than \s+$, which would also match the file's final newline
+ assert not re.search(r'[ \t]+$', content, re.MULTILINE), \
+ f"File {md_file} should not have trailing whitespace"
+
+class TestEdgeCases:
+ """Test handling of edge cases and error conditions."""
+
+ def test_empty_extension_handling(self, mapping_file):
+ """Test that empty or minimal extension data is handled properly."""
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ # If there are extensions, they should have proper structure
+ if "extensions" in data and data["extensions"]:
+ for ext_name, ext_data in data["extensions"].items():
+ # Even minimal entries should have the required structure
+ assert ext_name, "Extension names should not be empty"
+ assert isinstance(ext_data, dict), f"Extension {ext_name} data should be a dict"
+
+ def test_special_characters_handling(self, matrix_file, report_file):
+ """Test that special characters are handled properly in markdown."""
+
+ for md_file in [matrix_file, report_file]:
+ content = md_file.read_text(encoding='utf-8')
+
+ # Should handle common special characters without breaking markdown
+ # This is more of a smoke test - if the file loads without issues,
+ # basic character handling is working
+ assert len(content) > 0, f"File {md_file} should have content"
+
+ def test_unicode_handling(self, mapping_file):
+ """Test that Unicode characters are handled properly in JSON."""
+ try:
+ with open(mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+ # If we can load it as JSON with UTF-8, Unicode handling is working
+ assert isinstance(data, dict)
+ except UnicodeDecodeError:
+ pytest.fail("JSON file should handle Unicode characters properly")
+
+class TestIntegration:
+ """Integration tests across the entire compatibility analysis."""
+
+ def test_complete_workflow_output(self, docs_path):
+ """Test that the complete workflow produces expected outputs."""
+
+ # All required files should exist
+ required_files = [
+ "straightfour-gltf-matrix.md",
+ "straightfour-gltf-report.md",
+ "straightfour-gltf-mapping.json"
+ ]
+
+ for filename in required_files:
+ file_path = docs_path / filename
+ assert file_path.exists(), f"Required file {filename} should exist"
+ assert file_path.stat().st_size > 0, f"File {filename} should not be empty"
+
+ def test_cross_file_references(self, matrix_file, report_file, mapping_file):
+ """Test that files reference each other appropriately."""
+
+ report_content = report_file.read_text(encoding='utf-8')
+
+ # Report might reference the matrix or mapping files
+ # This is optional but good practice
+ references = ["matrix", "mapping", "compatibility"]
+ reference_found = any(ref in report_content.lower() for ref in references)
+
+ # This is more of a quality check than a hard requirement
+ if not reference_found:
+ print("Note: Report doesn't seem to cross-reference other files")
+
+ def test_analysis_completeness(self, expected_entities, mapping_file, report_file):
+ """Test that the analysis covers the expected scope."""
+
+ # Check that major entity types are covered
+ report_content = report_file.read_text(encoding='utf-8').lower()
+
+ covered_entities = 0
+ for entity in expected_entities:
+ entity_variations = [
+ entity.lower(),
+ entity.replace("Entity", "").lower(),
+ entity.replace("Entity", " Entity").lower()
+ ]
+
+ if any(variation in report_content for variation in entity_variations):
+ covered_entities += 1
+
+ # Should cover at least 30% of the expected entities
+ coverage_ratio = covered_entities / len(expected_entities)
+ assert coverage_ratio >= 0.3, f"Analysis should cover at least 30% of entities, got {coverage_ratio:.2%}"
\ No newline at end of file
diff --git a/tests/test_compatibility_edge_cases.py b/tests/test_compatibility_edge_cases.py
new file mode 100644
index 0000000..b90ca55
--- /dev/null
+++ b/tests/test_compatibility_edge_cases.py
@@ -0,0 +1,234 @@
+import unittest
+import json
+import re
+from pathlib import Path
+
+class TestCompatibilityEdgeCases(unittest.TestCase):
+ """Test suite for edge cases in compatibility analysis."""
+
+ def setUp(self):
+ """Set up test fixtures."""
+ self.docs_path = Path("docs/compatibility")
+ self.matrix_file = self.docs_path / "straightfour-gltf-matrix.md"
+ self.report_file = self.docs_path / "straightfour-gltf-report.md"
+ self.mapping_file = self.docs_path / "straightfour-gltf-mapping.json"
+
+ def test_extensions_with_no_straightfour_equivalent(self):
+ """Test handling of extensions with no StraightFour equivalent."""
+ with open(self.mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ # Find extensions with no mapped entities
+ unmapped_extensions = []
+ for ext_name, ext_data in data["extensions"].items():
+ if not ext_data["mapped_entities"]:
+ unmapped_extensions.append(ext_name)
+
+ # Should have at least some unmapped extensions
+ self.assertGreater(len(unmapped_extensions), 0,
+ "Should have some extensions with no StraightFour equivalent")
+
+ # These should be marked as unsupported
+ for ext_name in unmapped_extensions:
+ ext_data = data["extensions"][ext_name]
+ self.assertFalse(ext_data["supported"],
+ f"Unmapped extension {ext_name} should be marked as unsupported")
+
+ def test_entities_mapping_to_multiple_extensions(self):
+ """Test handling of entities that map to multiple extensions."""
+ with open(self.mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ # Count mappings per entity
+ entity_mappings = {}
+ for ext_name, ext_data in data["extensions"].items():
+ for entity in ext_data["mapped_entities"]:
+ if entity not in entity_mappings:
+ entity_mappings[entity] = []
+ entity_mappings[entity].append(ext_name)
+
+ # Find entities mapped to multiple extensions
+ multi_mapped_entities = {entity: exts for entity, exts in entity_mappings.items()
+ if len(exts) > 1}
+
+ # Should have at least some entities mapped to multiple extensions
+ if multi_mapped_entities:
+ # Check that these are handled properly in the report
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ report_content = f.read()
+
+ for entity in multi_mapped_entities:
+ self.assertIn(entity, report_content,
+ f"Multi-mapped entity {entity} should be discussed in report")
+
+ def test_extensions_with_dependencies(self):
+ """Test handling of extensions that require other extensions."""
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ report_content = f.read()
+
+ # Look for mentions of dependencies
+ dependency_keywords = ["depend", "require", "prerequisite", "based on"]
+ has_dependency_discussion = any(keyword in report_content.lower()
+ for keyword in dependency_keywords)
+
+ if has_dependency_discussion:
+ # If dependencies are discussed, they should be handled properly
+ self.assertIn("dependencies", report_content.lower(),
+ "Report should explicitly discuss extension dependencies")
+
+ def test_optional_vs_required_fields(self):
+ """Test handling of optional vs required fields in extensions."""
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Should discuss optional vs required fields
+ field_keywords = ["optional", "required", "mandatory"]
+ has_field_discussion = any(keyword in content.lower() for keyword in field_keywords)
+
+ self.assertTrue(has_field_discussion,
+ "Report should discuss optional, required, or mandatory fields")
+
+ def test_partial_support_details(self):
+ """Test that partial support is properly detailed."""
+ with open(self.mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ # Find extensions with partial support
+ partial_extensions = []
+ for ext_name, ext_data in data["extensions"].items():
+ if ext_data["supported"] and ext_data["missing_features"]:
+ partial_extensions.append(ext_name)
+
+ if partial_extensions:
+ # Check that partial support is explained in the matrix
+ with open(self.matrix_file, 'r', encoding='utf-8') as f:
+ matrix_content = f.read()
+
+ for ext_name in partial_extensions:
+ # Should be marked as "Partial" in the matrix
+ self.assertIn(ext_name, matrix_content)
+ # Should have details about missing features
+ lines = matrix_content.split('\n')
+ for line in lines:
+ if ext_name in line and '|' in line:
+ self.assertIn("Partial", line,
+ f"Extension {ext_name} should be marked as Partial")
+ break
+
+ def test_entities_with_incomplete_documentation(self):
+ """Test handling of entities with incomplete documentation."""
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Look for acknowledgment of documentation limitations
+ limitation_keywords = ["limited", "incomplete", "unclear", "documentation", "assumption"]
+ has_limitations = any(keyword in content.lower() for keyword in limitation_keywords)
+
+ # Informational: flag reports that never acknowledge analysis limitations
+ if not has_limitations:
+ print("Note: report does not acknowledge analysis limitations")
+
+ def test_extension_versioning(self):
+ """Test handling of extension versions."""
+ with open(self.mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ # Check if any extensions have version information
+ versioned_extensions = []
+ for ext_name in data["extensions"].keys():
+ if re.search(r'v\d+|_\d+', ext_name):
+ versioned_extensions.append(ext_name)
+
+ if versioned_extensions:
+ # If there are versioned extensions, they should be handled properly
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ report_content = f.read()
+
+ for ext_name in versioned_extensions:
+ self.assertIn(ext_name, report_content,
+ f"Versioned extension {ext_name} should be in report")
+
+ def test_complex_entity_hierarchies(self):
+ """Test handling of complex entity hierarchies."""
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Should discuss entity relationships
+ hierarchy_keywords = ["inherit", "extend", "base", "derived", "hierarchy", "component"]
+ has_hierarchy_discussion = any(keyword in content.lower() for keyword in hierarchy_keywords)
+
+ # Informational: flag reports with no hierarchy discussion
+ if not has_hierarchy_discussion:
+ print("Note: report does not discuss entity hierarchies")
+
+ def test_ui_entity_special_handling(self):
+ """Test special handling of UI entities and their sub-types."""
+ with open(self.mapping_file, 'r', encoding='utf-8') as f:
+ data = json.load(f)
+
+ # UI entities might need special handling
+ ui_entities = ["UI", "Canvas", "Button", "Input", "Text"]
+
+ # Check if UI entities are mapped to appropriate extensions
+ ui_mapped_extensions = []
+ for ext_name, ext_data in data["extensions"].items():
+ for entity in ext_data["mapped_entities"]:
+ if entity in ui_entities:
+ ui_mapped_extensions.append(ext_name)
+ break
+
+ if ui_mapped_extensions:
+ # UI mappings should be discussed in the report
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ report_content = f.read()
+
+ self.assertTrue(any(ui_entity in report_content for ui_entity in ui_entities),
+ "Report should discuss UI entity mappings")
+
+ def test_consistency_across_files(self):
+ """Test consistency of information across all output files."""
+ with open(self.mapping_file, 'r', encoding='utf-8') as f:
+ json_data = json.load(f)
+
+ with open(self.matrix_file, 'r', encoding='utf-8') as f:
+ matrix_content = f.read()
+
+ with open(self.report_file, 'r', encoding='utf-8') as f:
+ report_content = f.read()
+
+ # Check consistency of support levels between JSON and matrix
+ for ext_name, ext_data in json_data["extensions"].items():
+ if ext_name in matrix_content:
+ # Find the support level in the matrix
+ lines = matrix_content.split('\n')
+ for line in lines:
+ if ext_name in line and '|' in line:
+ parts = [part.strip() for part in line.split('|')]
+ if len(parts) >= 4: # Has support level column
+ matrix_support = parts[3].strip()
+
+ # Map JSON boolean to matrix text
+ expected_support = "Full" if ext_data["supported"] else "None"
+ if ext_data["supported"] and ext_data["missing_features"]:
+ expected_support = "Partial"
+
+ # Allow some flexibility in exact wording
+ if matrix_support and expected_support != "None":
+ self.assertNotEqual(matrix_support, "None",
+ f"Inconsistent support level for {ext_name}")
+ break
+
+ # Check that all extensions mentioned in JSON appear in report
+ for ext_name in json_data["extensions"].keys():
+ self.assertIn(ext_name, report_content,
+ f"Extension {ext_name} should be mentioned in report")
+
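+# Illustrative helper (a sketch, not used by the tests above): the markdown
+# matrix rows are parsed in several places; the shared logic looks like this,
+# assuming the column layout | Extension | Entity | Support Level | ... |.
+def _matrix_support_level(matrix_content, ext_name):
+    """Return the Support Level cell for ext_name, or None if not found."""
+    for line in matrix_content.split('\n'):
+        if ext_name in line and '|' in line and '---' not in line:
+            cells = [cell.strip() for cell in line.split('|')[1:-1]]
+            if len(cells) >= 3:
+                return cells[2]
+    return None
+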
+if __name__ == '__main__':
+ unittest.main()
\ No newline at end of file
diff --git a/tests/test_entity_extraction.py b/tests/test_entity_extraction.py
new file mode 100644
index 0000000..19b5cd8
--- /dev/null
+++ b/tests/test_entity_extraction.py
@@ -0,0 +1,221 @@
+import unittest
+import os
+from pathlib import Path
+import re
+
+
+class TestEntityExtraction(unittest.TestCase):
+ """Test suite for StraightFour entity extraction."""
+
+ def setUp(self):
+ """Set up test fixtures."""
+ self.assets_dir = Path("Assets/StraightFour/Entity")
+
+ def test_entity_directories_exist(self):
+ """Test that entity directories exist in the expected structure."""
+ expected_entities = [
+ "Airplane", "Audio", "Automobile", "Base", "Character",
+ "Light", "Mesh", "Terrain", "Voxel", "UI"
+ ]
+
+ if not self.assets_dir.exists():
+ self.skipTest("Entity directory not found")
+
+ existing_dirs = [d.name for d in self.assets_dir.iterdir() if d.is_dir()]
+
+ for entity in expected_entities:
+ self.assertIn(entity, existing_dirs, f"{entity} entity directory should exist")
+
+ def test_entity_scripts_exist(self):
+ """Test that entity script files exist."""
+ expected_scripts = [
+ ("Airplane", "AirplaneEntity.cs"),
+ ("Audio", "AudioEntity.cs"),
+ ("Automobile", "AutomobileEntity.cs"),
+ ("Base", "BaseEntity.cs"),
+ ("Base", "PlacementSocket.cs")
+ ]
+
+ if not self.assets_dir.exists():
+ self.skipTest("Entity directory not found")
+
+ for entity_dir, script_name in expected_scripts:
+ script_path = self.assets_dir / entity_dir / "Scripts" / script_name
+ if not script_path.exists():
+ # Fall back to the script sitting directly in the entity directory
+ script_path = self.assets_dir / entity_dir / script_name
+
+ self.assertTrue(script_path.exists(),
+ f"Script {script_name} should exist for {entity_dir} entity")
+
+ def test_entity_classes_defined(self):
+ """Test that entity classes are properly defined in scripts."""
+ entity_files = list(self.assets_dir.rglob("*Entity.cs"))
+
+ if len(entity_files) == 0:
+ self.skipTest("No entity script files found")
+
+ for entity_file in entity_files:
+ with open(entity_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Check for class definition
+ class_name = entity_file.stem # filename without extension
+ class_pattern = rf'class\s+{class_name}\s*:'
+
+ self.assertTrue(re.search(class_pattern, content),
+ f"Class {class_name} should be defined in {entity_file}")
+
+ def test_base_entity_inheritance(self):
+ """Test that entities inherit from BaseEntity appropriately."""
+ entity_files = list(self.assets_dir.rglob("*Entity.cs"))
+ entity_files = [f for f in entity_files if "BaseEntity" not in f.name]
+
+ if len(entity_files) == 0:
+ self.skipTest("No derived entity script files found")
+
+ for entity_file in entity_files:
+ with open(entity_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Check for inheritance from BaseEntity or MonoBehaviour
+ class_name = entity_file.stem
+ inheritance_patterns = [
+ rf'class\s+{class_name}\s*:\s*BaseEntity',
+ rf'class\s+{class_name}\s*:\s*MonoBehaviour',
+ rf'class\s+{class_name}\s*:\s*\w+Entity' # Might inherit from other entities
+ ]
+
+ has_inheritance = any(re.search(pattern, content) for pattern in inheritance_patterns)
+ self.assertTrue(has_inheritance,
+ f"Entity {class_name} should inherit from BaseEntity, another entity type, or MonoBehaviour")
+
+ def test_unity_components_structure(self):
+ """Test that Unity component structure is maintained."""
+ # Check for proper Unity namespace usage
+ cs_files = list(self.assets_dir.rglob("*.cs"))
+
+ if len(cs_files) == 0:
+ self.skipTest("No C# files found")
+
+ for cs_file in cs_files:
+ with open(cs_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Should have Unity using statements
+ unity_usings = ['using UnityEngine', 'UnityEngine.']
+ has_unity_usage = any(usage in content for usage in unity_usings)
+
+ if 'MonoBehaviour' in content or 'GameObject' in content:
+ self.assertTrue(has_unity_usage,
+ f"File {cs_file} using Unity classes should have Unity imports")
+
+
+class TestEntityDocumentation(unittest.TestCase):
+ """Test entity documentation and metadata extraction."""
+
+ def test_entity_comments_and_documentation(self):
+ """Test that entities have proper documentation."""
+ assets_dir = Path("Assets/StraightFour/Entity")
+
+ if not assets_dir.exists():
+ self.skipTest("Entity directory not found")
+
+ entity_files = list(assets_dir.rglob("*Entity.cs"))
+
+ for entity_file in entity_files:
+ with open(entity_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Check for some form of documentation
+ has_documentation = any(marker in content for marker in [
+ '///', '/**', '/*', 'summary', 'Summary'
+ ])
+
+ # At minimum, should have some comments
+ has_comments = '//' in content or '/*' in content
+
+ self.assertTrue(has_comments or has_documentation,
+ f"Entity file {entity_file} should have some documentation or comments")
+
+ def test_entity_public_methods_extraction(self):
+ """Test that public methods can be extracted from entities."""
+ assets_dir = Path("Assets/StraightFour/Entity")
+
+ if not assets_dir.exists():
+ self.skipTest("Entity directory not found")
+
+ entity_files = list(assets_dir.rglob("*Entity.cs"))
+
+ for entity_file in entity_files:
+ with open(entity_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # Find public methods
+ method_pattern = r'public\s+\w+\s+\w+\s*\('
+ public_methods = re.findall(method_pattern, content)
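+ # Note: this simple pattern matches e.g. "public void Update(" but misses
+ # extra modifiers ("public static async Task Foo(") and generic return
+ # types ("public List<int> Bar("); it gives a rough count, not an exact one.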
+
+ # Should have at least some public interface
+ # (Even if just Unity lifecycle methods)
+ if 'class' in content:
+ # This is more of an informational test
+ method_count = len(public_methods)
+ self.assertGreaterEqual(method_count, 0,
+ f"Entity {entity_file.stem} method count: {method_count}")
+
+
+class TestUIEntityStructure(unittest.TestCase):
+ """Test UI entity specific structure."""
+
+ def test_ui_entity_hierarchy(self):
+ """Test that UI entities have proper hierarchy."""
+ ui_dir = Path("Assets/StraightFour/Entity/UI")
+
+ if not ui_dir.exists():
+ self.skipTest("UI entity directory not found")
+
+ # Should have Canvas and Element subdirectories or similar
+ expected_ui_types = ["Canvas", "Element", "Button", "Input", "Text"]
+
+ ui_subdirs = [d.name for d in ui_dir.iterdir() if d.is_dir()]
+ ui_files = [f.name for f in ui_dir.rglob("*.cs")]
+
+ # At least some UI types should be present
+ found_ui_types = []
+ for ui_type in expected_ui_types:
+ if any(ui_type.lower() in name.lower() for name in ui_subdirs + ui_files):
+ found_ui_types.append(ui_type)
+
+ self.assertGreater(len(found_ui_types), 0,
+ "Should find at least some UI entity types")
+
+ def test_ui_component_structure(self):
+ """Test UI component structure matches expected patterns."""
+ ui_dir = Path("Assets/StraightFour/Entity/UI")
+
+ if not ui_dir.exists():
+ self.skipTest("UI entity directory not found")
+
+ ui_files = list(ui_dir.rglob("*.cs"))
+
+ for ui_file in ui_files:
+ with open(ui_file, 'r', encoding='utf-8') as f:
+ content = f.read()
+
+ # UI entities should likely reference Unity UI components
+ ui_patterns = [
+ 'UnityEngine.UI', 'Canvas', 'Button', 'Text', 'Image',
+ 'RectTransform', 'EventSystem'
+ ]
+
+ if 'UI' in ui_file.name:
+ has_ui_references = any(pattern in content for pattern in ui_patterns)
+ # Informational: not all files reference Unity UI types directly
+ if not has_ui_references:
+ print(f"Note: UI file {ui_file.name} has no direct Unity UI references")
+
+
+if __name__ == '__main__':
+ unittest.main()
\ No newline at end of file
diff --git a/tests/test_gltf_compatibility_analysis.py b/tests/test_gltf_compatibility_analysis.py
new file mode 100644
index 0000000..6fb4260
--- /dev/null
+++ b/tests/test_gltf_compatibility_analysis.py
@@ -0,0 +1,800 @@
+import unittest
+import json
+import os
+import tempfile
+import shutil
+from pathlib import Path
+import markdown
+import yaml
+
+class TestGLTFCompatibilityAnalysis(unittest.TestCase):
+ """Test suite for StraightFour-glTF compatibility analysis."""
+
+ def setUp(self):
+ """Set up test environment."""
+ self.test_docs_dir = Path("test_docs/compatibility")
+ self.test_docs_dir.mkdir(parents=True, exist_ok=True)
+
+ # Expected output files
+ self.matrix_file = self.test_docs_dir / "straightfour-gltf-matrix.md"
+ self.report_file = self.test_docs_dir / "straightfour-gltf-report.md"
+ self.mapping_file = self.test_docs_dir / "straightfour-gltf-mapping.json"
+
+ # Mock StraightFour entities for testing
+ self.mock_entities = {
+ "BaseEntity": {
+ "description": "Base entity class",
+ "fields": ["id", "name", "transform"],
+ "behaviors": ["placement", "serialization"],
+ "required_fields": ["id"],
+ "optional_fields": ["name", "transform"]
+ },
+ "AirplaneEntity": {
+ "description": "Aircraft entity",
+ "fields": ["id", "name", "transform", "flight_data"],
+ "behaviors": ["movement", "physics"],
+ "required_fields": ["id", "flight_data"],
+ "optional_fields": ["name", "transform"]
+ },
+ "AudioEntity": {
+ "description": "Audio entity",
+ "fields": ["id", "audio_clip", "volume"],
+ "behaviors": ["playback", "spatialization"],
+ "required_fields": ["id", "audio_clip"],
+ "optional_fields": ["volume"]
+ }
+ }
+
+ # Mock OMI glTF extensions for testing
+ self.mock_extensions = {
+ "OMI_physics_body": {
+ "purpose": "Add physics properties to nodes",
+ "schema_fields": ["type", "mass", "inertia"],
+ "required_fields": ["type"],
+ "optional_fields": ["mass", "inertia"],
+ "features": ["rigid_body", "collision", "mass_properties"],
+ "dependencies": []
+ },
+ "OMI_audio_emitter": {
+ "purpose": "Add audio emission to nodes",
+ "schema_fields": ["source", "volume", "loop"],
+ "required_fields": ["source"],
+ "optional_fields": ["volume", "loop"],
+ "features": ["audio_playback", "spatialization", "looping"],
+ "dependencies": []
+ },
+ "OMI_vehicle": {
+ "purpose": "Define vehicle properties",
+ "schema_fields": ["type", "engine", "wheels"],
+ "required_fields": ["type"],
+ "optional_fields": ["engine", "wheels"],
+ "features": ["vehicle_physics", "engine_simulation"],
+ "dependencies": ["OMI_physics_body"]
+ }
+ }
+
+ def tearDown(self):
+ """Clean up test environment."""
+ if self.test_docs_dir.exists():
+ shutil.rmtree(self.test_docs_dir.parent)
+
+ def test_matrix_file_exists(self):
+ """Test that compatibility matrix file exists."""
+ # Create a mock matrix file
+ self._create_mock_matrix_file()
+ self.assertTrue(self.matrix_file.exists())
+
+ def test_report_file_exists(self):
+ """Test that detailed report file exists."""
+ # Create a mock report file
+ self._create_mock_report_file()
+ self.assertTrue(self.report_file.exists())
+
+ def test_mapping_file_exists(self):
+ """Test that JSON mapping file exists."""
+ # Create a mock mapping file
+ self._create_mock_mapping_file()
+ self.assertTrue(self.mapping_file.exists())
+
+ def test_matrix_structure_valid(self):
+ """Test that compatibility matrix has valid structure."""
+ self._create_mock_matrix_file()
+
+ with open(self.matrix_file, 'r') as f:
+ content = f.read()
+
+ # Check for required columns
+ self.assertIn("Extension Name", content)
+ self.assertIn("StraightFour Entity", content)
+ self.assertIn("Support Level", content)
+ self.assertIn("Missing Features", content)
+ self.assertIn("Notes", content)
+
+ # Check for table formatting
+ self.assertIn("|", content)
+ self.assertIn("---", content)
+
+ def test_report_structure_valid(self):
+ """Test that detailed report has valid structure."""
+ self._create_mock_report_file()
+
+ with open(self.report_file, 'r') as f:
+ content = f.read()
+
+ # Check for required sections
+ required_sections = [
+ "Overview of StraightFour entity model",
+ "Overview of OMI glTF extension ecosystem",
+ "Mapping methodology",
+ "Per-extension analysis",
+ "Per-entity analysis",
+ "Summary of unsupported extensions",
+ "Summary of partially supported extensions",
+ "Recommendations for future support"
+ ]
+
+ for section in required_sections:
+ self.assertIn(section, content)
+
+ def test_mapping_json_valid(self):
+ """Test that JSON mapping file is valid JSON."""
+ self._create_mock_mapping_file()
+
+ with open(self.mapping_file, 'r') as f:
+ data = json.load(f)
+
+ # Validate JSON structure
+ self.assertIn("extensions", data)
+ self.assertIsInstance(data["extensions"], dict)
+
+ # Check extension structure
+ for ext_name, ext_data in data["extensions"].items():
+ self.assertIn("supported", ext_data)
+ self.assertIn("supported_features", ext_data)
+ self.assertIn("missing_features", ext_data)
+ self.assertIn("mapped_entities", ext_data)
+ self.assertIsInstance(ext_data["supported"], bool)
+ self.assertIsInstance(ext_data["supported_features"], list)
+ self.assertIsInstance(ext_data["missing_features"], list)
+ self.assertIsInstance(ext_data["mapped_entities"], list)
+
+ def test_support_levels_valid(self):
+ """Test that support levels are from valid set."""
+ self._create_mock_matrix_file()
+
+ with open(self.matrix_file, 'r') as f:
+ content = f.read()
+
+ valid_levels = ["Full", "Partial", "None"]
+ lines = content.split('\n')
+
+ # Find table rows (skip headers and separators)
+ data_rows = [line for line in lines
+ if '|' in line and '---' not in line and 'Extension Name' not in line]
+
+ for row in data_rows:
+ if row.strip(): # Skip empty rows
+ columns = [col.strip() for col in row.split('|')[1:-1]] # Remove empty first/last elements
+ if len(columns) >= 3: # Ensure we have enough columns
+ support_level = columns[2] # Support Level column
+ self.assertIn(support_level, valid_levels, f"Invalid support level: {support_level}")
+
+ def test_all_entities_included(self):
+ """Test that all StraightFour entities are included in analysis."""
+ self._create_mock_matrix_file()
+
+ with open(self.matrix_file, 'r') as f:
+ content = f.read()
+
+ # Check that all mock entities are mentioned
+ for entity_name in self.mock_entities.keys():
+ self.assertIn(entity_name, content, f"Entity {entity_name} not found in matrix")
+
+ def test_all_extensions_included(self):
+ """Test that all OMI glTF extensions are included in analysis."""
+ self._create_mock_mapping_file()
+
+ with open(self.mapping_file, 'r') as f:
+ data = json.load(f)
+
+ # Check that all mock extensions are included
+ for ext_name in self.mock_extensions.keys():
+ self.assertIn(ext_name, data["extensions"], f"Extension {ext_name} not found in mapping")
+
+ def test_entity_extension_mapping_consistency(self):
+ """Test consistency between matrix and JSON mapping."""
+ self._create_mock_matrix_file()
+ self._create_mock_mapping_file()
+
+ # Read matrix
+ with open(self.matrix_file, 'r') as f:
+ matrix_content = f.read()
+
+ # Read JSON mapping
+ with open(self.mapping_file, 'r') as f:
+ mapping_data = json.load(f)
+
+ # Extract mappings from both files and compare
+ for ext_name, ext_data in mapping_data["extensions"].items():
+ # Check that extension appears in matrix
+ self.assertIn(ext_name, matrix_content, f"Extension {ext_name} in JSON but not in matrix")
+
+ # Check that mapped entities appear in matrix
+ for entity in ext_data["mapped_entities"]:
+ self.assertIn(entity, matrix_content, f"Entity {entity} in JSON but not in matrix")
+
+ def test_required_vs_optional_fields_tracked(self):
+ """Test that required vs optional fields are properly tracked."""
+ self._create_mock_mapping_file()
+
+ with open(self.mapping_file, 'r') as f:
+ data = json.load(f)
+
+ # Verify that the mapping considers field requirements.
+ # This is implementation-specific, so we only check the structure here.
+ for ext_name, ext_data in data["extensions"].items():
+ if ext_name in self.mock_extensions:
+ # The missing_features list should account for required fields
+ self.assertIsInstance(ext_data["missing_features"], list)
+
+ def test_extension_dependencies_handled(self):
+ """Test that extension dependencies are properly handled."""
+ self._create_mock_report_file()
+
+ with open(self.report_file, 'r') as f:
+ content = f.read()
+
+ # Check that dependencies are mentioned
+ self.assertIn("dependencies", content.lower())
+
+ # For extensions with dependencies in mock data
+ for ext_name, ext_data in self.mock_extensions.items():
+ if ext_data["dependencies"]:
+ # Should mention the extension and its dependencies
+ self.assertIn(ext_name, content)
+
+ def test_partial_support_identification(self):
+ """Test that partial support is properly identified."""
+ self._create_mock_mapping_file()
+
+ with open(self.mapping_file, 'r') as f:
+ data = json.load(f)
+
+ # Look for extensions with partial support (supported, but with gaps)
+ partial_extensions = []
+ for ext_name, ext_data in data["extensions"].items():
+ if ext_data["supported"] and ext_data["missing_features"]:
+ partial_extensions.append(ext_name)
+
+ # Partially supported extensions must list both what works and what is missing
+ for ext_name in partial_extensions:
+ ext_data = data["extensions"][ext_name]
+ self.assertTrue(len(ext_data["supported_features"]) > 0)
+ self.assertTrue(len(ext_data["missing_features"]) > 0)
+
+ def test_unsupported_extensions_identified(self):
+ """Test that unsupported extensions are properly identified."""
+ self._create_mock_mapping_file()
+
+ with open(self.mapping_file, 'r') as f:
+ data = json.load(f)
+
+ # Look for completely unsupported extensions
+ unsupported_extensions = []
+ for ext_name, ext_data in data["extensions"].items():
+ if not ext_data["supported"]:
+ unsupported_extensions.append(ext_name)
+
+ # Verify structure is correct for unsupported extensions
+ for ext_name in unsupported_extensions:
+ ext_data = data["extensions"][ext_name]
+ self.assertEqual(len(ext_data["supported_features"]), 0)
+ self.assertEqual(len(ext_data["mapped_entities"]), 0)
+
+ def test_feature_level_granularity(self):
+ """Test that analysis includes feature-level granularity."""
+ self._create_mock_mapping_file()
+
+ with open(self.mapping_file, 'r') as f:
+ data = json.load(f)
+
+ # Check that features are tracked at granular level
+ for ext_name, ext_data in data["extensions"].items():
+ # Should have specific features listed
+ self.assertIsInstance(ext_data["supported_features"], list)
+ self.assertIsInstance(ext_data["missing_features"], list)
+
+ # At least one of these should be non-empty for any analyzed extension
+ total_features = len(ext_data["supported_features"]) + len(ext_data["missing_features"])
+ self.assertGreater(total_features, 0)
+
+ def test_markdown_syntax_valid(self):
+ """Test that markdown files have valid syntax."""
+ self._create_mock_matrix_file()
+ self._create_mock_report_file()
+
+ # Test matrix markdown
+ with open(self.matrix_file, 'r') as f:
+ matrix_content = f.read()
+
+ # Basic markdown table validation
+ lines = matrix_content.split('\n')
+ table_lines = [line for line in lines if '|' in line]
+
+ if table_lines:
+ # Check that all table rows have same number of columns
+ col_counts = [len(line.split('|')) - 2 for line in table_lines if '---' not in line] # -2 for empty start/end
+ if col_counts:
+ first_count = col_counts[0]
+ for count in col_counts:
+ self.assertEqual(count, first_count, "Inconsistent table column count")
+
+ # Test report markdown (basic structure)
+ with open(self.report_file, 'r') as f:
+ report_content = f.read()
+
+ # Should have headers
+ self.assertRegex(report_content, r'(?m)^#+ ', msg="Report should contain markdown headers")
+
+ def test_edge_case_multiple_entity_mappings(self):
+ """Test handling of extensions that map to multiple entities."""
+ self._create_mock_mapping_file()
+
+ with open(self.mapping_file, 'r') as f:
+ data = json.load(f)
+
+ # Look for extensions mapped to multiple entities
+ multi_mapped = []
+ for ext_name, ext_data in data["extensions"].items():
+ if len(ext_data["mapped_entities"]) > 1:
+ multi_mapped.append(ext_name)
+
+ # Verify structure is correct for multi-mapped extensions
+ for ext_name in multi_mapped:
+ ext_data = data["extensions"][ext_name]
+ self.assertIsInstance(ext_data["mapped_entities"], list)
+ for entity in ext_data["mapped_entities"]:
+ self.assertIsInstance(entity, str)
+
+ def test_edge_case_entity_no_extension_mapping(self):
+ """Test handling of entities with no glTF extension equivalent."""
+ self._create_mock_report_file()
+
+ with open(self.report_file, 'r') as f:
+ content = f.read()
+
+ # Should mention entities without mappings
+ self.assertIn("no clear", content.lower())
+
+ def _create_mock_matrix_file(self):
+ """Create a mock compatibility matrix file for testing."""
+ content = """# StraightFour-glTF Compatibility Matrix
+
+| Extension Name | StraightFour Entity | Support Level | Missing Features | Notes |
+|---|---|---|---|---|
+| OMI_physics_body | BaseEntity | Partial | inertia, mass_properties | Basic physics support |
+| OMI_audio_emitter | AudioEntity | Full | - | Complete audio support |
+| OMI_vehicle | AirplaneEntity | None | vehicle_physics, engine_simulation | No vehicle support |
+"""
+ with open(self.matrix_file, 'w') as f:
+ f.write(content)
+
+ def _create_mock_report_file(self):
+ """Create a mock detailed report file for testing."""
+ content = """# StraightFour-glTF Compatibility Report
+
+## Overview of StraightFour entity model
+StraightFour defines various entity types for spatial applications.
+
+## Overview of OMI glTF extension ecosystem
+OMI glTF extensions provide standardized ways to extend glTF capabilities.
+
+## Mapping methodology
+We analyzed each StraightFour entity against OMI glTF extensions.
+
+## Per-extension analysis
+### OMI_physics_body
+- Purpose: Add physics properties to nodes
+- Dependencies: none
+
+### OMI_audio_emitter
+- Purpose: Add audio emission to nodes
+- Dependencies: none
+
+## Per-entity analysis
+### BaseEntity
+Basic entity with transform and placement capabilities.
+
+### AudioEntity
+Audio playback entity with spatialization.
+
+## Summary of unsupported extensions
+Extensions with no clear StraightFour equivalent.
+
+## Summary of partially supported extensions
+Extensions with incomplete feature support.
+
+## Recommendations for future support
+Priority extensions for implementation.
+"""
+ with open(self.report_file, 'w') as f:
+ f.write(content)
+
+ def _create_mock_mapping_file(self):
+ """Create a mock JSON mapping file for testing."""
+ mapping_data = {
+ "extensions": {
+ "OMI_physics_body": {
+ "supported": True,
+ "supported_features": ["rigid_body", "collision"],
+ "missing_features": ["mass_properties"],
+ "mapped_entities": ["BaseEntity"]
+ },
+ "OMI_audio_emitter": {
+ "supported": True,
+ "supported_features": ["audio_playback", "spatialization", "looping"],
+ "missing_features": [],
+ "mapped_entities": ["AudioEntity"]
+ },
+ "OMI_vehicle": {
+ "supported": False,
+ "supported_features": [],
+ "missing_features": ["vehicle_physics", "engine_simulation"],
+ "mapped_entities": []
+ }
+ }
+ }
+
+ with open(self.mapping_file, 'w') as f:
+ json.dump(mapping_data, f, indent=2)
+
+
+class TestStraightFourEntityExtraction(unittest.TestCase):
+ """Test suite for StraightFour entity extraction functionality."""
+
+ def setUp(self):
+ """Set up test environment for entity extraction."""
+ self.test_assets_dir = Path("test_assets/StraightFour")
+ self.test_assets_dir.mkdir(parents=True, exist_ok=True)
+
+ def tearDown(self):
+ """Clean up test environment."""
+ if self.test_assets_dir.exists():
+ shutil.rmtree(self.test_assets_dir.parent)
+
+ def test_entity_script_parsing(self):
+ """Test parsing of entity C# scripts."""
+ # Create mock entity script
+ mock_script = """using UnityEngine;
+
+namespace StraightFour.Entity.Base
+{
+ ///
+ /// Base entity class for all StraightFour entities.
+ ///
+ public class BaseEntity : MonoBehaviour
+ {
+ [SerializeField]
+ public string id;
+
+ public string name;
+
+ public Transform transform;
+
+ public virtual void Initialize()
+ {
+ // Base initialization
+ }
+ }
+}
+"""
+
+ script_file = self.test_assets_dir / "Entity" / "Base" / "Scripts" / "BaseEntity.cs"
+ script_file.parent.mkdir(parents=True, exist_ok=True)
+
+ with open(script_file, 'w') as f:
+ f.write(mock_script)
+
+ # Test that file exists and can be read
+ self.assertTrue(script_file.exists())
+
+ with open(script_file, 'r') as f:
+ content = f.read()
+
+ # Basic parsing tests
+ self.assertIn("BaseEntity", content)
+ self.assertIn("public string id", content)
+ self.assertIn("MonoBehaviour", content)
+
+ def test_entity_directory_structure(self):
+ """Test entity directory structure parsing."""
+ # Create mock directory structure
+ entities = ["Base", "Airplane", "Audio", "Automobile"]
+
+ for entity in entities:
+ entity_dir = self.test_assets_dir / "Entity" / entity
+ entity_dir.mkdir(parents=True, exist_ok=True)
+
+ # Create Scripts subdirectory
+ scripts_dir = entity_dir / "Scripts"
+ scripts_dir.mkdir(exist_ok=True)
+
+ # Create mock script file
+ script_file = scripts_dir / f"{entity}Entity.cs"
+ with open(script_file, 'w') as f:
+ f.write(f"// Mock {entity}Entity script")
+
+ # Verify structure
+ for entity in entities:
+ entity_dir = self.test_assets_dir / "Entity" / entity
+ self.assertTrue(entity_dir.exists())
+
+ scripts_dir = entity_dir / "Scripts"
+ self.assertTrue(scripts_dir.exists())
+
+ def test_entity_metadata_extraction(self):
+ """Test extraction of entity metadata from Unity meta files."""
+ # Create mock .meta file
+ meta_content = """fileFormatVersion: 2
+guid: 12345678901234567890123456789012
+MonoImporter:
+ externalObjects: {}
+ serializedVersion: 2
+ defaultReferences: []
+ executionOrder: 0
+ icon: {instanceID: 0}
+ userData:
+ assetBundleName:
+ assetBundleVariant:
+"""
+
+ meta_file = self.test_assets_dir / "Entity" / "Base" / "Scripts" / "BaseEntity.cs.meta"
+ meta_file.parent.mkdir(parents=True, exist_ok=True)
+
+ with open(meta_file, 'w') as f:
+ f.write(meta_content)
+
+ # Test meta file parsing
+ self.assertTrue(meta_file.exists())
+
+ with open(meta_file, 'r') as f:
+ content = f.read()
+
+ self.assertIn("guid:", content)
+ self.assertIn("MonoImporter:", content)
+
+
+class TestOMIGLTFExtensionExtraction(unittest.TestCase):
+ """Test suite for OMI glTF extension extraction functionality."""
+
+ def setUp(self):
+ """Set up test environment for extension extraction."""
+ self.test_extensions_dir = Path("test_omi_gltf_extensions")
+ self.test_extensions_dir.mkdir(parents=True, exist_ok=True)
+
+ def tearDown(self):
+ """Clean up test environment."""
+ if self.test_extensions_dir.exists():
+ shutil.rmtree(self.test_extensions_dir)
+
+ def test_extension_readme_parsing(self):
+ """Test parsing of extension README files."""
+ # Create mock extension directory and README
+ ext_dir = self.test_extensions_dir / "extensions" / "2.0" / "OMI_physics_body"
+ ext_dir.mkdir(parents=True, exist_ok=True)
+
+ readme_content = """# OMI_physics_body
+
+## Contributors
+
+## Status
+
+## Dependencies
+
+## Overview
+
+This extension adds physics properties to glTF nodes.
+
+## Schema
+
+### Node
+
+| Property | Type | Description | Required |
+|----------|------|-------------|----------|
+| type | string | Physics body type | Yes |
+| mass | number | Mass of the body | No |
+| inertia | object | Inertia tensor | No |
+
+## Features
+
+- Rigid body physics
+- Collision detection
+- Mass properties
+"""
+
+ readme_file = ext_dir / "README.md"
+ with open(readme_file, 'w') as f:
+ f.write(readme_content)
+
+ # Test README parsing
+ self.assertTrue(readme_file.exists())
+
+ with open(readme_file, 'r') as f:
+ content = f.read()
+
+ self.assertIn("OMI_physics_body", content)
+ self.assertIn("physics properties", content)
+ self.assertIn("| type | string |", content)
+
+ def test_extension_schema_parsing(self):
+ """Test parsing of extension JSON schema files."""
+ # Create mock schema file
+ ext_dir = self.test_extensions_dir / "extensions" / "2.0" / "OMI_audio_emitter"
+ ext_dir.mkdir(parents=True, exist_ok=True)
+
+ schema_content = {
+ "$schema": "https://json-schema.org/draft/2020-12/schema",
+ "title": "OMI_audio_emitter Node Extension",
+ "type": "object",
+ "properties": {
+ "source": {
+ "type": "integer",
+ "description": "Index of audio source"
+ },
+ "volume": {
+ "type": "number",
+ "description": "Audio volume",
+ "minimum": 0.0,
+ "maximum": 1.0
+ },
+ "loop": {
+ "type": "boolean",
+ "description": "Whether audio loops"
+ }
+ },
+ "required": ["source"]
+ }
+
+ schema_file = ext_dir / "schema" / "node.OMI_audio_emitter.schema.json"
+ schema_file.parent.mkdir(exist_ok=True)
+
+ with open(schema_file, 'w') as f:
+ json.dump(schema_content, f, indent=2)
+
+ # Test schema parsing
+ self.assertTrue(schema_file.exists())
+
+ with open(schema_file, 'r') as f:
+ data = json.load(f)
+
+ self.assertEqual(data["title"], "OMI_audio_emitter Node Extension")
+ self.assertIn("source", data["properties"])
+ self.assertIn("source", data["required"])
+
+ def test_extension_directory_structure(self):
+ """Test OMI extension directory structure parsing."""
+ # Create mock extension structure
+ extensions = [
+ "OMI_physics_body",
+ "OMI_audio_emitter",
+ "OMI_vehicle",
+ "OMI_seat"
+ ]
+
+ for ext_name in extensions:
+ ext_dir = self.test_extensions_dir / "extensions" / "2.0" / ext_name
+ ext_dir.mkdir(parents=True, exist_ok=True)
+
+ # Create README
+ readme = ext_dir / "README.md"
+ with open(readme, 'w') as f:
+ f.write(f"# {ext_name}\n\nExtension description.")
+
+ # Create schema directory
+ schema_dir = ext_dir / "schema"
+ schema_dir.mkdir(exist_ok=True)
+
+ # Verify structure
+ for ext_name in extensions:
+ ext_dir = self.test_extensions_dir / "extensions" / "2.0" / ext_name
+ self.assertTrue(ext_dir.exists())
+
+ readme = ext_dir / "README.md"
+ self.assertTrue(readme.exists())
+
+
+class TestCompatibilityMapping(unittest.TestCase):
+ """Test suite for compatibility mapping logic."""
+
+ def test_semantic_mapping_rules(self):
+ """Test semantic mapping between entities and extensions."""
+ # Mock entity data
+ entity = {
+ "name": "AudioEntity",
+ "fields": ["audio_clip", "volume", "spatialization"],
+ "behaviors": ["playback", "3d_audio"]
+ }
+
+ # Mock extension data
+ extension = {
+ "name": "OMI_audio_emitter",
+ "fields": ["source", "volume", "loop"],
+ "features": ["audio_playback", "spatialization"]
+ }
+
+ # Derive concept sets from the mock data (simplified mapping logic)
+ entity_concepts = set(entity["fields"]) | set(entity["behaviors"])
+ extension_concepts = set(extension["fields"]) | set(extension["features"])
+
+ # Should find shared concepts such as "volume" and "spatialization"
+ overlap = entity_concepts.intersection(extension_concepts)
+ self.assertTrue(len(overlap) > 0)
+
+ def test_support_level_determination(self):
+ """Test logic for determining support levels."""
+ def determine_support_level(entity_features, extension_features):
+ if extension_features.issubset(entity_features):
+ return "Full"
+ elif entity_features.intersection(extension_features):
+ return "Partial"
+ return "None"
+
+ # Full support: entity covers every extension feature
+ self.assertEqual(
+ determine_support_level(
+ {"audio_playback", "volume_control", "spatialization"},
+ {"audio_playback", "volume_control"}),
+ "Full")
+
+ # Partial support: some, but not all, extension features are covered
+ self.assertEqual(
+ determine_support_level(
+ {"audio_playback"},
+ {"audio_playback", "volume_control", "spatialization"}),
+ "Partial")
+
+ # No support: no overlap at all
+ self.assertEqual(
+ determine_support_level(
+ {"visual_effects"},
+ {"audio_playback", "volume_control"}),
+ "None")
+
+ def test_missing_features_identification(self):
+ """Test identification of missing features."""
+ entity_features = set(["audio_playback", "volume_control"])
+ extension_features = set(["audio_playback", "volume_control", "spatialization", "looping"])
+
+ missing_features = extension_features - entity_features
+ expected_missing = set(["spatialization", "looping"])
+
+ self.assertEqual(missing_features, expected_missing)
+
+ def test_required_field_handling(self):
+ """Test handling of required vs optional fields."""
+ extension_required = set(["source"])
+ extension_optional = set(["volume", "loop"])
+ entity_fields = set(["audio_clip", "volume"])
+
+ # Map entity fields to extension fields (simplified)
+ field_mapping = {"audio_clip": "source", "volume": "volume"}
+ mapped_entity_fields = set(field_mapping.values())
+
+ missing_required = extension_required - mapped_entity_fields
+ missing_optional = extension_optional - mapped_entity_fields
+
+ self.assertEqual(len(missing_required), 0) # Required field is mapped
+ self.assertEqual(missing_optional, set(["loop"])) # Loop is missing
+
+
+if __name__ == '__main__':
+ unittest.main()
\ No newline at end of file