Comparison Tab
The Comparison tab helps you understand how different versions of your Digital Twin differ. As you refine your model—adding constraints, changing included fields, updating data—versions accumulate. The Comparison tab makes it easy to see exactly what changed.
Model evolution is part of the causal discovery process. You iterate: run discovery, review results, add domain knowledge, re-run. The Comparison tab ensures you don't lose track of how your model has evolved and helps you understand the impact of each change.
(SCREENSHOT: Comparison tab showing two versions side-by-side)
Selecting Versions to Compare
Start by choosing which versions to examine:
Version Selectors
Two dropdown selectors let you pick:
Base version – The reference point (often the current or production version)
Compare version – The version you want to compare against
Swapping Versions
Click the swap button to reverse which version is base vs. compare.
(SCREENSHOT: Version selector dropdowns with swap button)
Causal Graph Differences
The primary comparison shows how the causal graphs differ:
Side-by-Side Graphs
Both versions' graphs are displayed:
Nodes unique to one version are highlighted
Edges that differ are color-coded
Shared structure is shown in neutral colors
Difference Types:
Green edge – Added in compare version
Red edge – Removed in compare version
Yellow edge – Direction changed
Gray edge – Unchanged
(SCREENSHOT: Side-by-side causal graphs with color-coded differences)
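The color-coding above boils down to set arithmetic on the two versions' edge lists. A minimal sketch of that classification, assuming edges are represented as (source, target) tuples (the tuple format and category names are illustrative, not the product's internal representation):

```python
# Classify every edge across two graph versions into the diff categories
# used by the Comparison tab. Edge tuples and labels are assumptions.
def classify_edges(base_edges, compare_edges):
    """Return {edge: category} for added, removed, direction_changed, unchanged."""
    base, comp = set(base_edges), set(compare_edges)
    diffs = {}
    for edge in base | comp:
        reverse = (edge[1], edge[0])
        if edge in base and edge in comp:
            diffs[edge] = "unchanged"          # gray
        elif edge in comp and reverse in base:
            diffs[edge] = "direction_changed"  # yellow
        elif edge in comp:
            diffs[edge] = "added"              # green
        elif reverse in comp:
            continue  # already reported as direction_changed on the reversed edge
        else:
            diffs[edge] = "removed"            # red
    return diffs

base = [("price", "demand"), ("season", "price"), ("ads", "demand")]
compare = [("price", "demand"), ("price", "season"), ("promo", "demand")]
print(classify_edges(base, compare))
```

Note that a flipped edge is reported once, on the compare-version orientation, rather than as one removal plus one addition.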
Variable Mapping
When comparing versions with different fields, variable mapping shows correspondence:
Matched Variables – Variables that appear in both versions, possibly with different names.
Added Variables – Variables in the compare version but not the base.
Removed Variables – Variables in the base version but not the compare.
This is especially useful when data views have changed between versions.
(SCREENSHOT: Variable mapping table showing matched, added, and removed)
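Conceptually, the mapping is three set operations, plus a rename step for fields that were matched under new names. A hypothetical sketch (the `renames` dict standing in for whatever matching the product performs is an assumption):

```python
# Partition variables into matched / added / removed across two versions.
# The renames mapping (old name -> new name) is an illustrative assumption.
def map_variables(base_vars, compare_vars, renames=None):
    renames = renames or {}
    base = {renames.get(v, v) for v in base_vars}  # normalise renamed fields
    comp = set(compare_vars)
    return {
        "matched": sorted(base & comp),
        "added": sorted(comp - base),
        "removed": sorted(base - comp),
    }

result = map_variables(
    base_vars=["revenue", "ad_spend", "region"],
    compare_vars=["revenue", "marketing_spend", "channel"],
    renames={"ad_spend": "marketing_spend"},  # same field, new name
)
print(result)
```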
Parameter Comparison
Beyond graph structure, versions may differ in parameters:
Model Parameters
Statistical parameters learned during training:
Coefficients
Distributions
Relationship strengths
Configuration Differences
Settings that differed between versions:
Included/excluded fields
Temporal dependencies
Fixed subgraph constraints
(SCREENSHOT: Parameter comparison showing numerical differences)
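A numerical parameter diff typically ignores floating-point noise and surfaces only meaningful movements. A sketch under assumed parameter names and an assumed tolerance:

```python
import math

# Report parameters whose values moved by more than a tolerance between
# versions. Parameter names and the tolerance are illustrative assumptions.
def diff_parameters(base_params, compare_params, tol=1e-6):
    changes = {}
    for name in base_params.keys() & compare_params.keys():
        old, new = base_params[name], compare_params[name]
        if not math.isclose(old, new, abs_tol=tol):
            changes[name] = {"base": old, "compare": new, "delta": new - old}
    return changes

base = {"price->demand": -0.42, "ads->demand": 0.31}
compare = {"price->demand": -0.42, "ads->demand": 0.18}
print(diff_parameters(base, compare))
```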
Causal Diffs Detail
The detailed diff view lists every change:
Edge Changes Table
Source – Starting variable
Target – Ending variable
Base – Status in base version
Compare – Status in compare version
Change – Type of change
Change Types:
Added – Edge exists in compare but not base
Removed – Edge exists in base but not compare
Direction Changed – Edge exists in both but points differently
Strength Changed – Edge exists in both with different strength
(SCREENSHOT: Causal diffs table showing detailed edge changes)
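The "Strength Changed" rows cover edges present in both versions whose strengths diverge. A minimal sketch, assuming strengths are stored per (source, target) edge and a detection threshold (both assumptions for illustration):

```python
# Build "Strength Changed" diff rows for edges present in both versions.
# The dict-of-strengths representation and threshold are assumptions.
def strength_changes(base, compare, threshold=0.05):
    rows = []
    for edge in base.keys() & compare.keys():
        delta = compare[edge] - base[edge]
        if abs(delta) >= threshold:
            rows.append({"source": edge[0], "target": edge[1],
                         "base": base[edge], "compare": compare[edge],
                         "change": "Strength Changed"})
    return rows

base = {("price", "demand"): 0.80, ("season", "demand"): 0.20}
compare = {("price", "demand"): 0.55, ("season", "demand"): 0.22}
print(strength_changes(base, compare))
```

Edges whose strengths moved by less than the threshold are treated as unchanged, which keeps the table focused on substantive differences.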
Use Cases
Understanding Iteration Impact
After adding a constraint or changing data:
Select previous version as base
Select new version as compare
See exactly what changed in the graph
Regression Testing
Before deploying a new version:
Compare against production version
Verify changes are intentional
Check for unexpected edge additions/removals
Documentation
When explaining model changes to stakeholders:
Show the comparison view
Walk through significant differences
Explain reasoning for each change
Debugging
If a new version behaves unexpectedly:
Compare against working version
Identify structural differences
Investigate whether changes explain the behavior
(SCREENSHOT: Example workflow using comparison for model debugging)
Interpreting Differences
Many Differences = Major Change
If comparing versions shows many edge changes, the data or configuration changed significantly. This might be intentional (new data source) or warrant investigation.
Few Differences = Refinement
Minor changes suggest iterative refinement—perhaps adding a known relationship or blocking a spurious one.
No Differences
If two versions are identical, the changes were purely in configuration and did not affect the discovered structure.
Best Practices
Document Before Comparing
Note what changes you made before creating a new version. The comparison confirms whether those changes had the intended effect.
Compare Sequentially
When reviewing history, compare adjacent versions (v1 to v2, v2 to v3) rather than jumping (v1 to v5). This helps track incremental evolution.
Use for Quality Assurance
Before using a new version for important simulations, compare against a trusted version to verify it's reasonable.
Export for Records
Save comparison screenshots or data for model governance documentation.
Next Steps
After comparing versions:
If satisfied with new version, use it for Simulations
If differences are unexpected, investigate in Relationships Tab
If reverting, select the desired version from Home Tab
Full Model Comparison
For comprehensive model comparison including cross-twin comparisons, parameter deep-dives, and side-by-side graph visualization with diff overlays, use the dedicated Model Comparison page (Intelligence → Compare Models).