Minimize Sankey flow crossings with barycenter sort #51682

Merged: silamon merged 3 commits into dev from fix-28764 on Apr 26, 2026
Conversation

@MindFreeze (Member) commented Apr 23, 2026

Proposed change

Replace the pass-through-only ordering in the custom Sankey layout with a value-weighted barycenter sweep. Two alternating passes (L→R, R→L) iterate up to four times, sorting every node in each section by the weighted average position of its neighbours in the adjacent (already-sorted) section. Real nodes and pass-throughs are handled uniformly, including multi-depth edges via the existing passthrough id format.

Each candidate reorder is gated on a crossing count for the section's adjacent pairs and is only kept when crossings strictly decrease — so user-intended ordering is preserved whenever the barycenter would shuffle nodes without actually improving the layout.

The seed sort in ha-sankey-chart.ts now averages across all incoming links rather than only the first, which also improves the starting point.
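The core of the sweep can be sketched roughly like this (a simplified standalone model, not the actual sankey-layout.ts code; NodeLike, LinkLike, and the helper names are illustrative, and the crossing-count gate is omitted):

```typescript
interface NodeLike {
  id: string;
}

interface LinkLike {
  source: string;
  target: string;
  value: number;
}

// Value-weighted barycenter of one node: the average index of its
// neighbours in the adjacent (already-sorted) section, weighted by
// link value. Returns null when the node has no links into that section.
function barycenter(
  nodeId: string,
  links: LinkLike[],
  adjacentOrder: Map<string, number>,
  direction: "in" | "out"
): number | null {
  let weighted = 0;
  let total = 0;
  for (const link of links) {
    const self = direction === "in" ? link.target : link.source;
    const neighbour = direction === "in" ? link.source : link.target;
    if (self !== nodeId) continue;
    const idx = adjacentOrder.get(neighbour);
    if (idx === undefined) continue;
    weighted += idx * link.value;
    total += link.value;
  }
  return total > 0 ? weighted / total : null;
}

// One pass of the sweep: reorder a section by barycenter relative to the
// adjacent section; unconnected nodes keep their current position.
function sortSection(
  section: NodeLike[],
  links: LinkLike[],
  adjacent: NodeLike[],
  direction: "in" | "out"
): NodeLike[] {
  const adjacentOrder = new Map<string, number>(
    adjacent.map((n, i): [string, number] => [n.id, i])
  );
  return section
    .map((node, i) => ({
      node,
      key: barycenter(node.id, links, adjacentOrder, direction) ?? i,
    }))
    .sort((a, b) => a.key - b.key)
    .map((entry) => entry.node);
}
```

In the actual pass, a reorder produced this way would then only be accepted when the crossing count for the section's adjacent pairs strictly decreases.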

This means the device order from the energy configuration is no longer guaranteed, but I think the trade-off is worth it; judging by the number of bug reports about crossed connections, others seem to agree.

Performance takes a small hit from this, but it shouldn't be a problem unless someone has on the order of 1000 devices.

Screenshots

Type of change

  • Dependency upgrade
  • Bugfix (non-breaking change which fixes an issue)
  • New feature (thank you!)
  • Breaking change (fix/feature causing existing functionality to break)
  • Code quality improvements to existing code or addition of tests

Additional information

Checklist

  • I understand the code I am submitting and can explain how it works.
  • The code change is tested and works locally.
  • There is no commented out code in this PR.
  • I have followed the perfect PR recommendations
  • Any generated code has been carefully reviewed for correctness and compliance with project standards.

If user exposed functionality or configuration variables are added/changed:

To help with the load of incoming pull requests:

@MindFreeze MindFreeze marked this pull request as ready for review April 23, 2026 14:23
Copilot AI (Contributor) left a comment


Pull request overview

Improves the Energy Sankey layout by introducing a value-weighted barycenter sweep to reduce link crossings, including correct handling of pass-through nodes and multi-depth edges.

Changes:

  • Replaces the prior section ordering logic with an iterative L→R / R→L barycenter-based sort that only applies reorderings when crossings strictly decrease.
  • Adds utilities to support barycenter sorting and consistent pass-through node IDs.
  • Expands unit test coverage for barycenter computation and section sorting behavior (including passthrough and multi-parent cases).

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.

  • src/resources/echarts/components/sankey/sankey-layout.ts: Implements barycenter-based iterative node ordering with crossing-count gating and passthrough-aware neighbor handling.
  • src/components/chart/ha-sankey-chart.ts: Improves the initial (seed) ordering by averaging across all incoming parents instead of only the first.
  • test/resources/echarts/components/sankey/sankey-layout.test.ts: Adds targeted tests for barycenter math and crossing-reducing sort behavior, including passthrough alignment and idempotency.

Comment on lines +406 to +417
function countCrossings(segments: EdgeSegment[]): number {
  let crossings = 0;
  for (let i = 0; i < segments.length; i++) {
    for (let j = i + 1; j < segments.length; j++) {
      const a = segments[i];
      const b = segments[j];
      if ((a.sourceIdx - b.sourceIdx) * (a.targetIdx - b.targetIdx) < 0) {
        crossings++;
      }
    }
  }
  return crossings;

Copilot AI Apr 26, 2026


countCrossings() is O(n²) over the number of segments, and tryReplace() recomputes crossings (including rescanning all edges) twice per candidate replacement. With larger graphs this can become a noticeable bottleneck. Consider switching to an inversion-count approach (O(n log n)) for crossings between two sections (sort by sourceIdx and count inversions of targetIdx), and/or caching the segments per adjacent depth pair so you don’t rebuild them for both the before and after calculation on every attempt.

Suggested change (replace countCrossings with):

function countInversions(values: number[]): number {
  if (values.length < 2) {
    return 0;
  }
  const scratch = new Array<number>(values.length);
  const mergeSortAndCount = (left: number, right: number): number => {
    if (right - left <= 1) {
      return 0;
    }
    const mid = Math.floor((left + right) / 2);
    let inversions =
      mergeSortAndCount(left, mid) + mergeSortAndCount(mid, right);
    let i = left;
    let j = mid;
    let k = left;
    while (i < mid && j < right) {
      if (values[i] <= values[j]) {
        scratch[k++] = values[i++];
      } else {
        scratch[k++] = values[j++];
        inversions += mid - i;
      }
    }
    while (i < mid) {
      scratch[k++] = values[i++];
    }
    while (j < right) {
      scratch[k++] = values[j++];
    }
    for (let idx = left; idx < right; idx++) {
      values[idx] = scratch[idx];
    }
    return inversions;
  };
  return mergeSortAndCount(0, values.length);
}

function countCrossings(segments: EdgeSegment[]): number {
  if (segments.length < 2) {
    return 0;
  }
  const orderedTargets = [...segments]
    .sort(
      (a, b) => a.sourceIdx - b.sourceIdx || a.targetIdx - b.targetIdx
    )
    .map((segment) => segment.targetIdx);
  return countInversions(orderedTargets);
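For intuition on why the inversion-count route is equivalent, here is a standalone comparison of the quadratic pairwise check against an O(n log n) inversion count (this sketch uses a Fenwick tree rather than the merge sort in the suggestion, purely as an alternative; EdgeSegment mirrors the shape in the snippet):

```typescript
interface EdgeSegment {
  sourceIdx: number;
  targetIdx: number;
}

// Quadratic pairwise check, as in the current implementation.
function countCrossingsPairwise(segments: EdgeSegment[]): number {
  let crossings = 0;
  for (let i = 0; i < segments.length; i++) {
    for (let j = i + 1; j < segments.length; j++) {
      const a = segments[i];
      const b = segments[j];
      if ((a.sourceIdx - b.sourceIdx) * (a.targetIdx - b.targetIdx) < 0) {
        crossings++;
      }
    }
  }
  return crossings;
}

// O(n log n): sort by sourceIdx (tie-break on targetIdx so same-source
// segments never count), then count inversions of targetIdx with a
// Fenwick (binary indexed) tree over target ranks.
function countCrossingsFast(segments: EdgeSegment[]): number {
  const targets = [...segments]
    .sort((a, b) => a.sourceIdx - b.sourceIdx || a.targetIdx - b.targetIdx)
    .map((s) => s.targetIdx);
  const ranks = new Map<number, number>(
    [...targets].sort((a, b) => a - b).map((v, i): [number, number] => [v, i + 1])
  );
  const tree = new Array<number>(targets.length + 1).fill(0);
  const add = (i: number) => {
    for (; i <= targets.length; i += i & -i) tree[i]++;
  };
  const smallerSeen = (i: number) => {
    let s = 0;
    for (; i > 0; i -= i & -i) s += tree[i];
    return s;
  };
  let inversions = 0;
  // Scan right to left: each element pairs with already-seen (righter)
  // values that are strictly smaller, i.e. one inversion each.
  for (let k = targets.length - 1; k >= 0; k--) {
    const r = ranks.get(targets[k])!;
    inversions += smallerSeen(r - 1);
    add(r);
  }
  return inversions;
}
```

Both functions agree on any segment list; only the asymptotics differ.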
Comment on lines 293 to +304
  private _findParentIndex(id: string, links: Link[], sections: Node[][]) {
-   const parent = links.find((l) => l.target === id)?.source;
-   if (!parent) {
+   const parents = links.filter((l) => l.target === id).map((l) => l.source);
+   if (parents.length === 0) {
      return -1;
    }
-   let offset = 0;
-   for (let i = sections.length - 1; i >= 0; i--) {
-     const section = sections[i];
-     const index = section.findIndex((n) => n.id === parent);
-     if (index !== -1) {
-       return offset + index;
+   let sum = 0;
+   let count = 0;
+   for (const parent of parents) {
+     let offset = 0;
+     for (let i = sections.length - 1; i >= 0; i--) {
+       const section = sections[i];
+       const index = section.findIndex((n) => n.id === parent);

Copilot AI Apr 26, 2026


In the seed sort, _findParentIndex() is called inside the Array.sort() comparator, and it does a full links.filter(...) scan for every comparator call. This can blow up to O(nodes log nodes * links) work per section. Consider precomputing a Map<targetId, sourceIds[]> (or directly Map<targetId, avgParentIndex>) once before sorting nodesWithIndex, so the comparator becomes O(1) lookups instead of repeatedly scanning links.
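The precomputation the comment suggests could look roughly like this (an illustrative sketch, not the actual ha-sankey-chart.ts code; buildParentMap is a hypothetical helper name):

```typescript
interface Link {
  source: string;
  target: string;
}

// Build target -> parent source ids once, before sorting, so the
// comparator does O(1) map lookups instead of re-scanning `links`
// on every comparison.
function buildParentMap(links: Link[]): Map<string, string[]> {
  const parents = new Map<string, string[]>();
  for (const link of links) {
    const existing = parents.get(link.target);
    if (existing) {
      existing.push(link.source);
    } else {
      parents.set(link.target, [link.source]);
    }
  }
  return parents;
}
```

The seed sort could then resolve each node's parents with `parents.get(node.id) ?? []` inside the comparator, dropping the per-comparison links.filter scan.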

@silamon silamon merged commit 5a296fa into dev Apr 26, 2026
16 checks passed
@silamon silamon deleted the fix-28764 branch April 26, 2026 19:42