Problem Statement
Given a positive integer $n \geq 3$, consider a permutation $\pi = (a_1, a_2, \dots, a_n)$ of $\{1, 2, \dots, n\}$. For each $i$ ($1 \leq i \leq n-1$), define $d_i$ as the minimum absolute difference between $a_i$ and any subsequent element $a_j$ ($j > i$), i.e.,
$$d_i = \min \{ |a_i - a_j| \mid j > i \}.$$
Let $S_n$ denote the maximum possible value of $\sum_{i=1}^{n-1} d_i$ over all permutations of $\{1, \dots, n\}$, i.e.,
$$S_n = \max_{\pi} \sum_{i=1}^{n-1} d_i.$$
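To make the definition concrete, here is a small Python sketch (the function name is my own) that computes each $d_i$ and their sum for a given permutation:

```python
def gap_sum(perm):
    """Sum of d_i, where d_i = min |a_i - a_j| over all j > i."""
    return sum(
        min(abs(perm[i] - perm[j]) for j in range(i + 1, len(perm)))
        for i in range(len(perm) - 1)
    )

# Example: for the permutation (2, 4, 1, 3) of {1, 2, 3, 4}:
#   d_1 = min(|2-4|, |2-1|, |2-3|) = 1
#   d_2 = min(|4-1|, |4-3|)        = 1
#   d_3 = |1-3|                    = 2
print(gap_sum((2, 4, 1, 3)))  # 4
```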
Proposed Construction
I found a method to construct a permutation that seems to maximize $\sum d_i$:
1. Fix $a_{n-1} = 1$ and $a_n = n$.
2. For each $i$ (from $n-2$ down to $1$):
- Sort $a_{i+1}, a_{i+2}, \dots, a_n$ in increasing order.
- Compute the gaps between consecutive elements.
- Place $a_i$ in the middle of the largest gap (if the gap has even length, choose the smaller midpoint).
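The steps above can be sketched in Python as follows (this is my reading of the construction; when several gaps tie for largest, the code takes the leftmost, which the description leaves unspecified):

```python
def greedy_perm(n):
    """Greedy construction: start with a_{n-1} = 1, a_n = n, then fill
    positions right-to-left, placing each new element at the (lower)
    midpoint of the largest gap among the values already placed."""
    assert n >= 3
    suffix = [1, n]  # the fixed last two positions
    for _ in range(n - 2):
        s = sorted(suffix)
        # Find the largest gap between consecutive placed values
        # (leftmost on ties -- an assumption, not specified above).
        best_lo, best_len = s[0], 0
        for lo, hi in zip(s, s[1:]):
            if hi - lo > best_len:
                best_lo, best_len = lo, hi - lo
        mid = best_lo + best_len // 2  # lower midpoint when two candidates
        suffix.insert(0, mid)
    return suffix

print(greedy_perm(4))  # [3, 2, 1, 4]
```

For $n = 4$ this yields $(3, 2, 1, 4)$ with $d_1 = 1$, $d_2 = 1$, $d_3 = 3$, so $\sum d_i = 5$.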
Partial Results
1. I can prove that 1 and $n$ may be assumed to occupy the last two positions: moving either 1 or $n$ further right never decreases $\sum d_i$, so some optimal permutation ends with them.
2. The construction greedily maximizes each $d_i$ locally, but I'm unsure whether this guarantees global optimality.
Request for Help
- Does this construction always yield the maximum $S_n$?
- If yes, how can we rigorously prove it? (Induction? Exchange arguments?)
- If no, what is the correct approach?
Observations:
- The construction attains the maximum for small $n$ (checked for $n = 3, 4, \dots, 12$).
- The problem resembles optimizing "minimum gaps" in permutations.
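For the small-$n$ checks, a brute-force comparison over all $n!$ permutations (my own sketch, not part of the original claim) is straightforward, though only feasible up to roughly $n = 10$:

```python
from itertools import permutations

def gap_sum(perm):
    """Sum of d_i = min |a_i - a_j| over all j > i."""
    return sum(
        min(abs(perm[i] - perm[j]) for j in range(i + 1, len(perm)))
        for i in range(len(perm) - 1)
    )

def brute_max(n):
    """Exhaustive computation of S_n by trying all n! permutations."""
    return max(gap_sum(p) for p in permutations(range(1, n + 1)))

for n in range(3, 8):
    print(n, brute_max(n))
```

Comparing `brute_max(n)` against the greedy construction's sum for each small $n$ is exactly the verification described in the observation above.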
Any insights or references would be greatly appreciated!