
June Highlights and 2025 AoPS Online Class Information
jlacosta   0
Jun 2, 2025
Congratulations to all the mathletes who competed at National MATHCOUNTS! If you missed the exciting Countdown Round, you can watch the video at this link. Are you interested in training for MATHCOUNTS or AMC 10 contests? How would you like to train for these math competitions in half the time? We have accelerated sections that meet twice per week instead of once, starting July 8th (7:30pm ET). These sections fill quickly, so enroll today!

[list][*]MATHCOUNTS/AMC 8 Basics
[*]MATHCOUNTS/AMC 8 Advanced
[*]AMC 10 Problem Series[/list]
For those interested in Olympiad level training in math, computer science, physics, and chemistry, be sure to enroll in our WOOT courses before August 19th to take advantage of early bird pricing!

Summer camps are starting this month at the Virtual Campus in math and language arts, running two to four weeks in duration. Spaces are still available - don’t miss your chance to have a transformative summer experience. There are middle and high school competition math camps as well as Math Beasts camps that review key topics coupled with fun explorations covering areas such as graph theory (Math Beasts Camp 6), cryptography (Math Beasts Camp 7-8), and topology (Math Beasts Camp 8-9)!

Be sure to mark your calendars for the following upcoming events:
[list][*]June 5th, Thursday, 7:30pm ET: Open Discussion with Ben Kornell and Andrew Sutherland. Art of Problem Solving's incoming CEO Ben Kornell and CPO Andrew Sutherland host an Ask Me Anything-style chat. Come ask your questions and get to know our incoming CEO & CPO!
[*]June 9th, Monday, 7:30pm ET: Game Jam: Operation Shuffle! Come join us to play our second round of Operation Shuffle! If you enjoy number sense, logic, and a healthy dose of luck, this is the game for you. No specific math background is required; all are welcome.[/list]
Our full course list for upcoming classes is below:
All classes run 7:30pm-8:45pm ET/4:30pm - 5:45pm PT unless otherwise noted.

Introductory: Grades 5-10

Prealgebra 1 Self-Paced

Prealgebra 1
Sunday, Jun 15 - Oct 12
Monday, Jun 30 - Oct 20
Wednesday, Jul 16 - Oct 29
Sunday, Aug 17 - Dec 14
Tuesday, Aug 26 - Dec 16
Friday, Sep 5 - Jan 16
Monday, Sep 8 - Jan 12
Tuesday, Sep 16 - Jan 20 (4:30 - 5:45 pm ET/1:30 - 2:45 pm PT)
Sunday, Sep 21 - Jan 25
Thursday, Sep 25 - Jan 29
Wednesday, Oct 22 - Feb 25
Tuesday, Nov 4 - Mar 10
Friday, Dec 12 - Apr 10

Prealgebra 2 Self-Paced

Prealgebra 2
Monday, Jun 2 - Sep 22
Sunday, Jun 29 - Oct 26
Friday, Jul 25 - Nov 21
Sunday, Aug 17 - Dec 14
Tuesday, Sep 9 - Jan 13
Thursday, Sep 25 - Jan 29
Sunday, Oct 19 - Feb 22
Monday, Oct 27 - Mar 2
Wednesday, Nov 12 - Mar 18

Introduction to Algebra A Self-Paced

Introduction to Algebra A
Sunday, Jun 15 - Oct 12
Thursday, Jun 26 - Oct 9
Tuesday, Jul 15 - Oct 28
Sunday, Aug 17 - Dec 14
Wednesday, Aug 27 - Dec 17
Friday, Sep 5 - Jan 16
Thursday, Sep 11 - Jan 15
Sunday, Sep 28 - Feb 1
Monday, Oct 6 - Feb 9
Tuesday, Oct 21 - Feb 24
Sunday, Nov 9 - Mar 15
Friday, Dec 5 - Apr 3

Introduction to Counting & Probability Self-Paced

Introduction to Counting & Probability
Sunday, Jun 1 - Aug 24
Thursday, Jun 12 - Aug 28
Wednesday, Jul 2 - Sep 17
Sunday, Jul 27 - Oct 19
Monday, Aug 11 - Nov 3
Wednesday, Sep 3 - Nov 19
Sunday, Sep 21 - Dec 14 (1:00 - 2:30 pm ET/10:00 - 11:30 am PT)
Friday, Oct 3 - Jan 16
Tuesday, Nov 4 - Feb 10
Sunday, Dec 7 - Mar 8

Introduction to Number Theory
Monday, Jun 9 - Aug 25
Sunday, Jun 15 - Sep 14
Tuesday, Jul 15 - Sep 30
Wednesday, Aug 13 - Oct 29
Friday, Sep 12 - Dec 12
Sunday, Oct 26 - Feb 1
Monday, Dec 1 - Mar 2

Introduction to Algebra B Self-Paced

Introduction to Algebra B
Wednesday, Jun 4 - Sep 17
Sunday, Jun 22 - Oct 19
Friday, Jul 18 - Nov 14
Thursday, Aug 7 - Nov 20
Monday, Aug 18 - Dec 15
Sunday, Sep 7 - Jan 11
Thursday, Sep 11 - Jan 15
Wednesday, Sep 24 - Jan 28
Sunday, Oct 26 - Mar 1
Tuesday, Nov 4 - Mar 10
Monday, Dec 1 - Mar 30

Introduction to Geometry
Monday, Jun 16 - Dec 8
Friday, Jun 20 - Jan 9
Sunday, Jun 29 - Jan 11
Monday, Jul 14 - Jan 19
Wednesday, Aug 13 - Feb 11
Tuesday, Aug 26 - Feb 24
Sunday, Sep 7 - Mar 8
Thursday, Sep 11 - Mar 12
Wednesday, Sep 24 - Mar 25
Sunday, Oct 26 - Apr 26
Monday, Nov 3 - May 4
Friday, Dec 5 - May 29

Paradoxes and Infinity
Mon, Tue, Wed, & Thurs, Jul 14 - Jul 17 (meets every day of the week!)

Intermediate: Grades 8-12

Intermediate Algebra
Sunday, Jun 1 - Nov 23
Tuesday, Jun 10 - Nov 18
Wednesday, Jun 25 - Dec 10
Sunday, Jul 13 - Jan 18
Thursday, Jul 24 - Jan 22
Friday, Aug 8 - Feb 20
Tuesday, Aug 26 - Feb 24
Sunday, Sep 28 - Mar 29
Wednesday, Oct 8 - Mar 8
Sunday, Nov 16 - May 17
Thursday, Dec 11 - Jun 4

Intermediate Counting & Probability
Sunday, Jun 22 - Nov 2
Sunday, Sep 28 - Feb 15
Tuesday, Nov 4 - Mar 24

Intermediate Number Theory
Sunday, Jun 1 - Aug 24
Wednesday, Jun 18 - Sep 3
Wednesday, Sep 24 - Dec 17

Precalculus
Sunday, Jun 1 - Nov 9
Monday, Jun 30 - Dec 8
Wednesday, Aug 6 - Jan 21
Tuesday, Sep 9 - Feb 24
Sunday, Sep 21 - Mar 8
Monday, Oct 20 - Apr 6
Sunday, Dec 14 - May 31

Advanced: Grades 9-12

Olympiad Geometry
Tuesday, Jun 10 - Aug 26

Calculus
Wednesday, Jun 25 - Dec 17
Sunday, Sep 7 - Mar 15
Wednesday, Sep 24 - Apr 1
Friday, Nov 14 - May 22

Group Theory
Thursday, Jun 12 - Sep 11

Contest Preparation: Grades 6-12

MATHCOUNTS/AMC 8 Basics
Monday, Jun 2 - Aug 18
Thursday, Jun 12 - Aug 28
Sunday, Jun 22 - Sep 21
Tues & Thurs, Jul 8 - Aug 14 (meets twice a week!)
Sunday, Aug 17 - Nov 9
Wednesday, Sep 3 - Nov 19
Tuesday, Sep 16 - Dec 9
Sunday, Sep 21 - Dec 14 (1:00 - 2:30 pm ET/10:00 - 11:30 am PT)
Monday, Oct 6 - Jan 12
Thursday, Oct 16 - Jan 22
Tues, Thurs & Sun, Dec 9 - Jan 18 (meets three times a week!)

MATHCOUNTS/AMC 8 Advanced
Wednesday, Jun 11 - Aug 27
Sunday, Jun 22 - Sep 21
Tues & Thurs, Jul 8 - Aug 14 (meets twice a week!)
Sunday, Aug 17 - Nov 9
Tuesday, Aug 26 - Nov 11
Thursday, Sep 4 - Nov 20
Friday, Sep 12 - Dec 12
Monday, Sep 15 - Dec 8
Sunday, Oct 5 - Jan 11
Tues, Thurs & Sun, Dec 2 - Jan 11 (meets three times a week!)
Mon, Wed & Fri, Dec 8 - Jan 16 (meets three times a week!)

AMC 10 Problem Series
Sunday, Jun 1 - Aug 24
Thursday, Jun 12 - Aug 28
Tuesday, Jun 17 - Sep 2
Sunday, Jun 22 - Sep 21 (1:00 - 2:30 pm ET/10:00 - 11:30 am PT)
Monday, Jun 23 - Sep 15
Tues & Thurs, Jul 8 - Aug 14 (meets twice a week!)
Sunday, Aug 10 - Nov 2
Thursday, Aug 14 - Oct 30
Tuesday, Aug 19 - Nov 4
Mon & Wed, Sep 15 - Oct 22 (meets twice a week!)
Mon, Wed & Fri, Oct 6 - Nov 3 (meets three times a week!)
Tue, Thurs & Sun, Oct 7 - Nov 2 (meets three times a week!)

AMC 10 Final Fives
Monday, Jun 30 - Jul 21
Friday, Aug 15 - Sep 12
Sunday, Sep 7 - Sep 28
Tuesday, Sep 9 - Sep 30
Monday, Sep 22 - Oct 13
Sunday, Sep 28 - Oct 19 (1:00 - 2:30 pm ET/10:00 - 11:30 am PT)
Wednesday, Oct 8 - Oct 29
Thursday, Oct 9 - Oct 30

AMC 12 Problem Series
Thursday, Jun 12 - Aug 28
Sunday, Jun 22 - Sep 21
Wednesday, Aug 6 - Oct 22
Sunday, Aug 10 - Nov 2
Monday, Aug 18 - Nov 10
Mon & Wed, Sep 15 - Oct 22 (meets twice a week!)
Tues, Thurs & Sun, Oct 7 - Nov 2 (meets three times a week!)

AMC 12 Final Fives
Thursday, Sep 4 - Sep 25
Sunday, Sep 28 - Oct 19
Tuesday, Oct 7 - Oct 28

AIME Problem Series A
Thursday, Oct 23 - Jan 29

AIME Problem Series B
Sunday, Jun 22 - Sep 21
Tuesday, Sep 2 - Nov 18

F=ma Problem Series
Wednesday, Jun 11 - Aug 27
Tuesday, Sep 16 - Dec 9
Friday, Oct 17 - Jan 30

WOOT Programs
Visit the pages linked for full schedule details for each of these programs!


MathWOOT Level 1
MathWOOT Level 2
ChemWOOT
CodeWOOT
PhysicsWOOT

Programming

Introduction to Programming with Python
Sunday, Jun 15 - Sep 14 (1:00 - 2:30 pm ET/10:00 - 11:30 am PT)
Tuesday, Jun 17 - Sep 2
Monday, Jun 30 - Sep 22
Thursday, Aug 14 - Oct 30
Sunday, Sep 7 - Nov 23
Tuesday, Dec 2 - Mar 3

Intermediate Programming with Python
Sunday, Jun 1 - Aug 24
Monday, Jun 30 - Sep 22
Friday, Oct 3 - Jan 16

USACO Bronze Problem Series
Sunday, Jun 22 - Sep 1
Wednesday, Sep 3 - Dec 3
Thursday, Oct 30 - Feb 5
Tuesday, Dec 2 - Mar 3

Physics

Introduction to Physics
Sunday, Jun 15 - Sep 14
Monday, Jun 23 - Sep 15
Tuesday, Sep 2 - Nov 18
Sunday, Oct 5 - Jan 11
Wednesday, Dec 10 - Mar 11

Physics 1: Mechanics
Monday, Jun 23 - Dec 15
Sunday, Sep 21 - Mar 22
Sunday, Oct 26 - Apr 26

Relativity
Mon, Tue, Wed & Thurs, Jun 23 - Jun 26 (meets every day of the week!)
0 replies
jlacosta
Jun 2, 2025
Linetown Mayor Admits Orz
Rijul saini   1
a few seconds ago by YaoAOPS
Source: LMAO 2025 Day 1 Problem 2
Having won the elections in Linetown, Turbo the Snail has become mayor, and one of the most pressing issues he needs to work on is the road network. Linetown can be represented as a configuration of $2025$ lines
in the plane, of which no two are parallel and no three are concurrent.

There is one house in Linetown for each pairwise intersection of two lines. The $2025$ lines are used as roads by the townsfolk. In the past, the roads in Linetown used to be two-way, but this often led to residents accidentally cycling back to where they started.

Turbo wants to make each of the $2025$ roads one-way such that it is impossible for any resident to start at a house, follow the roads in the correct directions, and end up back at the original house. In how many ways can Turbo achieve this?

Proposed by Archit Manas
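Not part of the original post, but tiny cases of this problem are pleasant to poke at computationally. The sketch below (my own Python scaffolding; `count_acyclic_orientations` and the sample lines are hypothetical, not from the source) builds the houses as pairwise intersections, the roads as segments between consecutive houses on each line, and counts the direction assignments (one per line) that leave no directed cycle, using Kahn's algorithm for the acyclicity check.

```python
from itertools import product, combinations

def count_acyclic_orientations(lines):
    """lines: list of (slope, intercept) pairs in general position.
    Houses are the pairwise intersections; each line's roads are the
    segments between consecutive houses on it.  Count the direction
    assignments (one per line) that create no directed cycle."""
    n = len(lines)

    def meet(i, j):
        # intersection point of lines i and j
        a1, b1 = lines[i]
        a2, b2 = lines[j]
        x = (b2 - b1) / (a1 - a2)
        return (x, a1 * x + b1)

    pts = {frozenset((i, j)): meet(i, j) for i, j in combinations(range(n), 2)}
    good = 0
    for dirs in product((1, -1), repeat=n):  # +1: travel in increasing x
        edges = []
        for i in range(n):
            on_line = sorted((p for key, p in pts.items() if i in key),
                             key=lambda p: dirs[i] * p[0])
            edges += list(zip(on_line, on_line[1:]))
        # Kahn's algorithm: the assignment works iff the digraph is acyclic
        verts = set(pts.values())
        indeg = {v: 0 for v in verts}
        for u, v in edges:
            indeg[v] += 1
        queue = [v for v in verts if indeg[v] == 0]
        seen = 0
        while queue:
            u = queue.pop()
            seen += 1
            for a, b in edges:
                if a == u:
                    indeg[b] -= 1
                    if indeg[b] == 0:
                        queue.append(b)
        good += (seen == len(verts))
    return good

# Three lines in general position: the houses form a triangle, and a
# directed cycle arises only for the 2 cyclic edge orientations, so 6
# of the 8 assignments work.
print(count_acyclic_orientations([(1, 0), (-1, 3), (0.25, 1)]))  # 6
```

The search doubles with every added line, so this only helps for spotting a pattern in very small cases, not for $2025$ lines.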
1 reply
Rijul saini
Yesterday at 6:59 PM
YaoAOPS
a few seconds ago
Functional equation: f(xf(y)+f(x)f(y))=xf(y)+f(xy)
Behappy0918   2
9 minutes ago by Behappy0918
Find all functions $f: \mathbb{R} \to \mathbb{R}$ such that for all $x, y\in\mathbb{R}$, $$f(xf(y)+f(x)f(y))=xf(y)+f(xy)$$
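Not a solution, but the two natural candidates can be spot-checked numerically: $f(x)=x$ makes both sides $2xy$, and $f\equiv 0$ makes both sides $0$. A minimal Python sketch (my own code; random sampling verifies the identity at sample points, it does not prove these are the only solutions):

```python
import random

def satisfies(f, trials=1000):
    """Spot-check f(x*f(y) + f(x)*f(y)) == x*f(y) + f(x*y) at random points."""
    for _ in range(trials):
        x = random.uniform(-10, 10)
        y = random.uniform(-10, 10)
        lhs = f(x * f(y) + f(x) * f(y))
        rhs = x * f(y) + f(x * y)
        if abs(lhs - rhs) > 1e-6:
            return False
    return True

print(satisfies(lambda x: x))  # True
print(satisfies(lambda x: 0))  # True
```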
2 replies
Behappy0918
Tuesday at 12:24 PM
Behappy0918
9 minutes ago
Painting Beads on Necklace
amuthup   47
31 minutes ago by ezpotd
Source: 2021 ISL C2
Let $n\ge 3$ be a fixed integer. There are $m\ge n+1$ beads on a circular necklace. You wish to paint the beads using $n$ colors, such that among any $n+1$ consecutive beads every color appears at least once. Find the largest value of $m$ for which this task is \emph{not} possible.

Carl Schildkraut, USA
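For small $n$ the answer can be found by brute force. The sketch below (my own Python, not from the thread; `paintable` is a hypothetical helper) checks every coloring of $m$ beads with $n=3$ colors against the window condition; the only impossible $m \ge n+1$ in the tested range turns out to be $5$, consistent with the closed form $n^2-n-1$ that I believe is the answer.

```python
from itertools import product

def paintable(m, n):
    """Can m beads on a circular necklace be n-colored so that every
    n+1 consecutive beads (cyclically) contain all n colors?"""
    for coloring in product(range(n), repeat=m):
        if all(len({coloring[(s + t) % m] for t in range(n + 1)}) == n
               for s in range(m)):
            return True
    return False

n = 3
impossible = [m for m in range(n + 1, 9) if not paintable(m, n)]
print(impossible)  # [5]
```

For $m=5$ one can also argue directly: some color appears exactly once, and the window of $4$ consecutive beads avoiding that bead misses the color, matching the brute-force result.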
47 replies
amuthup
Jul 12, 2022
ezpotd
31 minutes ago
Onto the altitude
TheUltimate123   4
37 minutes ago by EpicBird08
Source: Extension of nukelauncher's and my Mock AIME #15 (https://artofproblemsolving.com/community/c875089h1825979p12212193)
In triangle $ABC$, let $D$, $E$, and $F$ denote the feet of the altitudes from $A$, $B$, and $C$, respectively, and let $O$ denote the circumcenter of $\triangle ABC$. Points $X$ and $Y$ denote the projections of $E$ and $F$, respectively, onto $\overline{AD}$, and $Z=\overline{AO}\cap\overline{EF}$. There exists a point $T$ such that $\angle DTZ=90^\circ$ and $AZ=AT$. If $P=\overline{AD}\cap\overline{ZT}$ and $Q$ lies on $\overline{EF}$ such that $\overline{PQ}\parallel\overline{BC}$, prove that line $AQ$ bisects $\overline{BC}$.
4 replies
TheUltimate123
May 19, 2019
EpicBird08
37 minutes ago
The Bank of Bath
TelMarin   100
Apr 17, 2025 by Ihatecombin
Source: IMO 2019, problem 5
The Bank of Bath issues coins with an $H$ on one side and a $T$ on the other. Harry has $n$ of these coins arranged in a line from left to right. He repeatedly performs the following operation: if there are exactly $k>0$ coins showing $H$, then he turns over the $k$th coin from the left; otherwise, all coins show $T$ and he stops. For example, if $n=3$ the process starting with the configuration $THT$ would be $THT \to HHT  \to HTT \to TTT$, which stops after three operations.

(a) Show that, for each initial configuration, Harry stops after a finite number of operations.

(b) For each initial configuration $C$, let $L(C)$ be the number of operations before Harry stops. For example, $L(THT) = 3$ and $L(TTT) = 0$. Determine the average value of $L(C)$ over all $2^n$ possible initial configurations $C$.

Proposed by David Altizio, USA
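The process is easy to simulate before reading the replies. A minimal Python sketch (my own code, not from the thread) that reproduces the example $L(THT)=3$ and tabulates the average of $L(C)$ over all $2^n$ configurations for small $n$:

```python
from itertools import product

def L(config):
    """Run Harry's operation on a sequence of 'H'/'T' and count the steps."""
    coins = list(config)
    ops = 0
    while (k := coins.count('H')) > 0:
        i = k - 1  # the k-th coin from the left, 0-indexed
        coins[i] = 'T' if coins[i] == 'H' else 'H'
        ops += 1
    return ops

print(L('THT'))  # 3, matching the example in the statement
print(L('TTT'))  # 0

for n in range(1, 7):
    avg = sum(L(c) for c in product('HT', repeat=n)) / 2 ** n
    print(n, avg)
```

The printed averages match the closed form $\tfrac{n(n+1)}{4}$ derived in the replies below.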
100 replies
TelMarin
Jul 17, 2019
Ihatecombin
Apr 17, 2025
AnSoLiN
72 posts
#98
Let $f_{xyd}(n)$ where $x,y\in\{H,T\}$ and $d\in\{L=\text{Left},R=\text{Right}\}$ be defined as the sum of the number of operations over all initial configurations under the following operation: if there are exactly $k>0$ coins showing $x$, then he turns over the $k^\text{th}$ coin from the $d$, until all coins show $y$. Let $f^*_{xyd}(n)$ be the same as $f_{xyd}(n)$ except that the $(k+1)^\text{th}$ coin is turned over. Only $f_{HTL}$, $f_{THL}$, $f_{HTR}$, $f_{THR}$, $f^*_{HHL}$, $f^*_{TTL}$, $f^*_{HHR}$, $f^*_{TTR}$ make sense. The problem asks for $\alpha_n=\dfrac{f_{HTL}(n)}{2^n}$

Divide $2^n$ initial configurations into two classes, those that end with a $H$ and those that end with a $T$. Sum of the number of operations for $2^{n-1}$ initial configurations that end with a $T$ is clearly $f_{HTL}(n-1)$ as the last $T$ affects nothing. For the initial configurations that end with a $H$, to touch the last $H$, all coins should be showing $H$. After that point, it takes $n$ steps to make all of them $T$. To get to the point that all coins show $H$, we need a total of $f^*_{HHL}(n-1)$ operations, since the last $H$ contributes 1 to the total number of $H$s.

$$f_{HTL}(n)=f_{HTL}(n-1)+f^*_{HHL}(n-1)+2^{n-1}\cdot n$$$$f_{HHL}^*(n-1)\overset{(1)}{=}f_{TTL}^*(n-1)\overset{(2)}{=}f_{TTR}^*(n-1)\overset{(3)}{=}f_{HTL}(n-1)$$
(1) Initial configurations are symmetric wrt $H$ and $T$.
(2) Initial configurations are symmetric wrt $L$ and $R$ when $x=y$.
(3) The $k^\text{th}$ coin from the left, where $k$ is the number of $H$s, is the same as the $(l+1)^\text{th}$ coin from the right, where $l$ is the number of $T$s.

Therefore, $f_{HTL}(n)=2\cdot f_{HTL}(n-1)+2^{n-1}\cdot n$ and $\alpha_n=\alpha_{n-1}+\dfrac{n}{2}$. Clearly, $\alpha_n=\dfrac{n^2+n}{4}$.
john0512
4191 posts
#99
We first compute $L(1)=1/2, L(2)=3/2, L(3)=3, L(4)=5$ by going through all possible cases. This leads us to the following claim:

Claim: $L(n)=\frac{n(n+1)}{4}.$ We will show this by induction. We have already seen that this is satisfied for $n=1,2,3,4.$

Case 1: The string starts with H. Then, the expected number of moves is $L(n-1)+1$ since it will take on average $L(n-1)$ moves to make everything after it T and 1 additional move to finish.

Case 2: The string ends with T. Then, the expected number of moves is $L(n-1)$ since the final $T$ has no effect.

Case 3: The string starts with H and ends with T. Then, the expected number of moves is $L(n-2)+1$, since it takes $L(n-2)$ moves to convert the middle part into T's and one more move to finish.

Case 4: The string starts with T and ends with H. Then, it will take $L(n-2)$ moves to turn it into $TTT\cdots H$, and then another $n-1$ moves to turn it into $HHH\cdots H$, and $n$ moves to finish, so it will take expected $L(n-2)+2n-1$ moves.

Note that by PIE, (Case 1 + Case 2 - Case 3) covers all cases where it either starts with H or ends with T. Therefore, (Case 1 + Case 2 - Case 3 + Case 4) exactly covers all possible cases. Therefore, we have $$L(n)=\frac{1}{2}(L(n-1)+1)+\frac{1}{2}(L(n-1))-\frac{1}{4}(L(n-2)+1)+\frac{1}{4}(L(n-2)+2n-1).$$By our inductive hypothesis, we plug in $L(n-2)=\frac{(n-2)(n-1)}{4}$ and $L(n-1)=\frac{(n-1)n}{4},$ we get $$L(n)=\frac{n(n+1)}{4},$$so we are done by induction. Note that this also solves part (a) since if any case is infinite, then the expected value is infinite, contradiction.
huashiliao2020
1292 posts
#100
ah... the Bank of Bath problem. not sure why it's named that, nor why i feel like it had to say Oslo in it somewhere, and also glad that i solved it because i thought it was notorious for being some 50 MOHS but apparently not

We claim the expected value of the process, $p_n$, is $\frac{n(n+1)}{4}$. We will proceed by induction, with base case $p_1=\frac12$. Here are the subcases in the induction:

(1) Ends in tails: This has average $p_{n-1}$.
(2) Starts in heads: This has average $p_{n-1}+1$ (since we can ignore the first coin and the process is the same, plus 1 when you need to flip the first at the end).
(3) Starts in tails, ends in heads: This is the same as starting in heads, ending in tails (call this situation (4)), because indices in the middle string are the same, except when you flip the first heads, instead of being all tails, it is HTT...TH, which takes another 2(n-1) flips (easily checked manually, by say induction on the number of Ts).

Then $$2^np_n=(1)+(2)+(3)-(4)=2^{n-1}p_{n-1}+2^{n-1}(p_{n-1}+1)+2^{n-2}((3)-(4))=2^np_{n-1}+2^{n-1}+2^{n-1}(n-1)\implies p_n=p_{n-1}+\frac n2=\frac{n(n+1)}{4},$$as desired. $\blacksquare$
bryanguo
1032 posts
#106
solved and wrote up (a) before (b) but whatever

We first prove that the procedure always terminates. For sake of convenience, denote $H=1$ and $T=0.$ We use induction to prove that for any starting position, the sequence eventually becomes a string of $1$'s followed by a string of $0$'s, which obviously terminates.

For base case $n=1,$ this is obviously true. For inductive step, assume that for $k$ coins, our hypothesis holds. Then, for $k+1$ coins, consider cases depending on if the first coin is $1$ or $0.$

If the first coin is $1,$ the rest of the coins represent a binary string of length $k,$ which from inductive assumption we know terminates. In this situation, all the indices of the binary string of length $k$ after the first coin flip are shifted up by $1,$ but since the first coin has value $1,$ the total number of $1$'s is also shifted up by $1.$ Therefore, the procedure in this case will be identical to the procedure for the binary string of length $k$ until the final step, at which we change the $1$ at the first index to a $0.$ Thus all strings starting with a $1$ terminate.

If the first coin is $0,$ insert a coin in front of the first coin with label $1.$ Then, since we know any string starting with a $1$ terminates with the string after the initial $1$ terminating completely first, it follows the string of length $k$ after the $1$ we inserted must terminate, as desired. It remains to determine the average number of steps it takes for the process to terminate, over all configurations.

Let $f(n)$ be the average number of steps it takes for the process to terminate for a string of length $n.$ We prove that $f(n)=\tfrac{n(n+1)}{4}.$ Testing the first few cases, we find $f(1)=\tfrac{1}{2},$ $f(2)=\tfrac{3}{2},$ $f(3)=3,$ $f(4)=5.$ Proceed by strong induction. The base cases are evident as described. For inductive step, assume $f(k)=\tfrac{k(k+1)}{4}$ for a string of length $k.$ Then we prove that this holds for $k+1.$ Consider the following cases:
  • If the sequence starts with a $1,$ then we have $f(k)+1.$
  • If the sequence ends with a $0,$ then we have $f(k).$
  • If the sequence starts with a $1$ and ends with a $0,$ observe the trailing zero once again does not do anything. The starting $1$ also does not change the operations performed of the $k$ trailing digits. Then in this case we have $f(k-1)+1,$ where the $+1$ accounts for the deletion of the initial $1.$
  • If the sequence starts with a $0$ and ends with a $1,$ observe that the final $1$ shifts the total number of $1$'s in the string up by $1,$ and the initial $0$ increases the index of each of the trailing $k$ numbers by $1.$ Thus it takes $f(k-1)$ operations to get to a string of $0$'s followed by one $1.$ At this point, it takes $2k+1$ operations to complete the process. Thus in this case we have $f(k-1)+2k+1.$
Now, by Principle of Inclusion and Exclusion,
\begin{align*}
f(k+1)&=\frac{1}{2}(f(k)+1)+\frac{1}{2}(f(k))-\frac{1}{4}(f(k-1)+1)+\frac{1}{4}(f(k-1)+2k+1) \\
&=\frac{(k+1)(k+2)}{4}.
\end{align*}This completes the induction.
Mr.Sharkman
510 posts
#107
I thought this one was really nice
Solution
shendrew7
806 posts
#108
Let the desired answer be $a_n$, which we will prove is $\boxed{\frac{n^2+n}{4}}$ through recursion. Note the ending state contains all tails.

It's easy to get the base cases $a_1 = \tfrac 12$ and $a_2 = \tfrac 32$. For $n \ge 3$, we use casework on the first and last flip:
  • Starts with $T$, ends in $H$: Notice the ending $H$ will never be changed until all $n$ flips are heads, and this stage cannot be reached without changing the leading $T$, which only occurs after we have the middle $n-2$ flips are tails as well. Hence we expect $a_{n-2} + (n-1) + n$ moves in this case.
  • Starts with $H$ or ends in $T$: We use PIE:
    • Starts with $H$: Never gets changed until the final $n-1$ coins are tails, so we expect $a_{n-1}+1$.
    • Ends with $T$: Never changed, so we expect $a_{n-1}$.
    • Both: Using our previous analysis, we get $a_{n-2}+1$.
Thus our recursion is
\[a_n = \tfrac 14 (a_{n-2} + 2n-1) + \tfrac 12 (a_{n-1}+1) + \tfrac 12 (a_{n-1}) - \tfrac 14 (a_{n-2}+1) = a_{n-1} + \tfrac n2,\]
from which we get the desired closed form. $\blacksquare$
blueprimes
363 posts
#109
Not sure if anybody posted this sol yet

Let $E(n)$ be the wanted expected value, we claim that $E(n) = \dfrac{n(n +1)}{4}$ which is indeed finite and thus all configurations terminate. To prove this we induct, the $n = 1$ case is obvious. Now assume this is true for some $n$, and consider an arbitrary sequence $S$ of $n + 1$ coins.

First, we consider when the last coin is $H$. Define the inverse of a row of coins as the result when all the coins are flipped and their order is reversed. For example, the inverse of $HHTTH$ is $THHTT$. Moreover, let $P$ be the sequence of the first $n$ coins in $S$, and let $Q$ be the inverse of $P$.

If a row of coins has $h$ instances of $H$, define the operations:
  • $f$ as flipping coin $h$.
  • $g$ as flipping coin $h + 1$.
Claim 1: $g^k(P)$ is the inverse of $f^k(Q)$ for all positive integers $k$.

Proof. It is sufficient to show this for $k = 1$; iterating then yields the claim. Suppose $P$ has $h$ instances of $H$. Then $g(P)$ has coin $h + 1$ flipped, so its inverse is $Q$ with coin $(n + 1) - (h + 1) = n - h$ flipped. On the other hand, $Q$ has $n - h$ instances of $H$, so $f(Q)$ is $Q$ with coin $n - h$ flipped. So they are equivalent.

Note that the inverse of the all-$H$ sequence is all-$T$. Now since applying $f$ onto $S$ applies $g$ onto $P$, the number of steps until it reaches the all-$H$ sequence is the same number of steps for $Q$ to reach the all-$T$ sequence. But since the inverse property is a bijection, the average number of such steps is simply $E_n$. There is an additional $n + 1$ steps to go from all-$H$ to all-$T$, so the cumulative expected value is $E_n + n + 1$.

Now when the last coin is $T$, it is unaffected, and the average over all these cases is $E_n$.

We have $E_{n + 1} = \dfrac{1}{2}(E_n + E_n + n + 1) = \dfrac{n(n + 1)}{4} + \dfrac{n + 1}{2} = \dfrac{(n + 1)(n + 2)}{4}$ and the induction is finished. It is done.
This post has been edited 1 time. Last edited by blueprimes, Sep 19, 2024, 7:19 PM
Maximilian113
577 posts
#110
We claim that the expected value of $L(C)$ over all possible $2^n$ configurations of length $n$ is $\frac{n(n+1)}{4}.$ This will prove part (a), since this value is finite, and finish part (b).

Let $e_n$ be the answer for $n,$ $a_n$ be the expected value for a sequence of length $n$ that ends with $H,$ and let $b_n$ be the expected value for a sequence of length $n$ that ends with $T.$ Observe that $e_n = \frac12 (a_n+b_n).$

Now, suppose that $n \geq 3.$ Clearly, from $a_n,$ if the sequence starts at $T$ we can ignore the front and end of the sequence temporarily, and we will have on average $e_{n-2}$ operations for the middle $n-2$ coins to all become $T.$ Then, there will be $n-1$ operations to turn everything into $H,$ and finally $n$ operations to turn everything into $T.$

However, if from $a_n$ the sequence starts at $H,$ we are essentially in the same situation as $a_{n-1}$ by ignoring the first coin, and then one more operation is required to turn the first $H$ into a $T.$ Therefore, $$a_n=\frac12 (e_{n-2}+2n-1)+\frac12 (a_{n-1}+1) = n+\frac12 e_{n-2}+\frac12 a_{n-1}.$$
Meanwhile, it is easy to see that $$b_n=e_{n-1}=\frac12 (a_{n-1}+b_{n-1}).$$
Therefore, $$a_n = n+\frac12(b_{n-1}+a_{n-1})=n+b_n.$$Substituting this into the second recurrence yields $$2b_n = n-1+2b_{n-1} \implies 2(b_n-b_{n-1})=n-1 \implies 2(b_n-b_2)=(n-1)+(n-2)+\cdots+3+2.$$Therefore, $b_n=\frac{n(n-1)}{4}.$ From here, we have that $$e_n=\frac12(n+2b_n)=\frac{n(n+1)}{4}.$$QED
cursed_tangent1434
661 posts
#111
First we compute the following.

Claim : The number of steps it takes to reach $n$ $T$s from $n$ $H$s is exactly $n$.

Proof : Since we go from $n$ $H$s down to $0$ $H$s, in each step the rightmost $H$ turns into a $T$. Thus, we take
\[HH\dots HH \longrightarrow HH\dots HT \longrightarrow \dots
    \longrightarrow HT\dots TT \longrightarrow TT\dots TT\]which takes exactly $n$ steps.

Let $M_n$ be the total sum of steps taken over all $2^n$ configurations to terminate. Now, we make the following key observation.

Claim : Let a certain pair of configurations of coins
\[[a_1 a_2 \dots a_{n-1}] H - [b_1 b_2 \dots b_{n-1}] T
    \]be \textit{paired} if for each $1 \le i \le n-1$, exactly one of $a_i$ and $b_{n-i}$ is tails and the other is heads.
We claim that a pair of configurations remains paired until they reach their end states (all $H$ and all $T$).


Proof : Assume that before a certain move these are paired. Note that if $[a_1 a_2 \dots a_{n-1}] H$ has $i$ Heads in total, then $[b_1 b_2 \dots b_{n-1}]T$ has $n-i$ Heads: the first configuration has $i-1$ Heads besides its final $H$, so by the pairing the $b_j$ contain $(n-1)-(i-1)=n-i$ Heads.
Now, this means the move flips Position $i$ of the first configuration and Position $n-i$ of the second, and these two positions correspond under the pairing. Thus, before the move one of Position $i$ (in the first) and Position $n-i$ (in the second) had Heads and the other Tails, and since both are flipped in this move, this condition still holds.

Thus, if a pair of configurations is paired (each configuration has a unique partner obviously) then they are paired until the end.

Thus, the sum of moves taken by the configurations ending with $H$ to reach the all-$H$ state is equal to the total number of moves taken by the configurations ending with $T$ to reach the all-$T$ state. But this is just $M_{n-1}$. Thus, the sum of steps taken by all configurations ending with $H$ to reach all $H$ is $M_{n-1}$, after which each such configuration takes exactly $n$ more steps to finish. Thus, we have a total number of steps,
\[M_{n} = 2M_{n-1} + 2^{n-1}n\]Now, it is easy to show that this recursion gives,
\[M_n = 2^{n-2}n(n+1)\]which makes the average number of steps taken to reach termination state over all $2^n$ configurations just
\[\frac{2^{n-2}n(n+1)}{2^n} = \boxed{\frac{n^2+n}{4}}\]
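Both the recursion and the closed form can be verified by brute force for small $n$. A short Python sketch (my own code, not the poster's) that totals the step counts of the process over all $2^n$ configurations:

```python
from itertools import product

def steps(config):
    """Number of operations the coin process takes on a string of 'H'/'T'."""
    coins = list(config)
    ops = 0
    while (k := coins.count('H')) > 0:
        coins[k - 1] = 'T' if coins[k - 1] == 'H' else 'H'
        ops += 1
    return ops

M = {n: sum(steps(c) for c in product('HT', repeat=n)) for n in range(1, 8)}
for n in range(2, 8):
    assert M[n] == 2 * M[n - 1] + 2 ** (n - 1) * n  # the recursion
    assert M[n] == 2 ** (n - 2) * n * (n + 1)       # the closed form
print([M[n] for n in range(1, 8)])  # [1, 6, 24, 80, 240, 672, 1792]
```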
eg4334
639 posts
#112
Phenomenal problem, I think my solution is different than most above and very weird. I claim the answer is $\boxed{\frac{n(n+1)}{4}}$. In particular, I will prove that the sum of the number of operations over all configurations is $2^n \frac{n(n+1)}{4}$, implying the result.

\begin{tabular}{|c c c c c c c c|} 
 \hline
 HHH & HTH & THH & TTH & HHT & HTT & THT & TTT \\ [0.5ex] 
 \hline \hline

 HHT & HHH & TTH & HTH & HTT & TTT & HHT & . \\ 
 \hline
HTT & HHT & HTH & HHH & TTT & . & HTT & .\\
\hline
TTT & HTT & HHH & HHT & . & . & TTT & .\\
\hline
. & TTT & HHT & HTT & . & . & . & . \\
\hline
. & . & HTT & TTT & . & . & . & . \\
\hline
. & . & TTT & . & . & . &. & . \\
\hline
\end{tabular}Above is the table for $n=3$. I prove part a) by induction. The base case is trivial. If the statement is true for $k$, then it is obviously true for $k+1$ in the cases where the string ends in a tail, because we can essentially ignore the tail at the end, reducing the string length down. For the ones that end in heads, looking at the table above we can make a crucial claim.

Claim: For those which end in heads, the string of sequences until it reaches the all heads position is identical to one that ends in tails except with the sequences flipped across the midpoint of the sequence not including the last head and with the heads and tails inverted.
Proof: First let's understand the claim. Looking at $TTH$ for example, the sequence up to $HHH$ is identical to that of $HHT$ up to $TTT$. However, every step of the sequence has the last coin different and the rest reflected and inverted. For example, the $TT$ are related to $HH$ in that the $TT$ is reflected across its midline to obtain $TT$ and then the heads and tails are inverted. As another example, notice how $TTHH$ gets to $HHHH$ in the same amount of flips as $THHT$ gets to $TTTT$. This is because $TTH$ reflected is $HTT$, which when inverted becomes $THH$ as desired. To prove this claim, let there be $x$ heads in the first $n-1$ coins of a sequence that ends in heads. Then, the next step consists of flipping the $(x+1)$th coin. In our reflected and inverted sequence that ends in tails, we now have $n-1-x$ heads. Thus, we flip the $(n-1-x)$th coin. Note that these are symmetric across the midline because $$(x+1) + (n-1-x) = (1) + (n-1)$$Therefore, the sequences will always be analogous under our reflection and inversion and thus will take the same amount of flips to get to either all $H$ or all $T$'s. $\blacksquare$
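The reflect-and-invert correspondence in this claim is easy to test mechanically. A small Python sketch (my own scaffolding; `partner` and `flips_until` are hypothetical names) verifying, for every length-$4$ string ending in $H$, that it reaches all $H$ in exactly as many operations as its reflected-and-inverted partner (which ends in $T$) takes to reach all $T$:

```python
from itertools import product

def flips_until(coins, stop):
    """Apply the flip-the-k-th-coin operation until stop(coins) holds."""
    coins = list(coins)
    ops = 0
    while not stop(coins):
        k = coins.count('H')
        coins[k - 1] = 'T' if coins[k - 1] == 'H' else 'H'
        ops += 1
    return ops

def partner(s):
    """Reflect the first n-1 coins across the midline, then invert H/T
    everywhere (the last coin is inverted but stays last)."""
    swap = {'H': 'T', 'T': 'H'}
    return tuple(swap[c] for c in reversed(s[:-1])) + (swap[s[-1]],)

n = 4
for s in product('HT', repeat=n):
    if s[-1] == 'H':
        to_all_H = flips_until(s, lambda c: c.count('H') == len(c))
        to_all_T = flips_until(partner(s), lambda c: c.count('H') == 0)
        assert to_all_H == to_all_T
print('correspondence holds for all strings of length', n, 'ending in H')
```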

Now part a) is done, because we have shown that all of the ones ending in $H$ get to all $H$ in the same amount of flips as the ones ending in $T$ take to get to all $T$ (note that once we get all $H$ the sequence is guaranteed to finish). To do part b), proceed again by induction: $$\frac{n(n+1)}{4} \cdot 2^n \cdot 2 + 2^n (n+1) = 2^{n+1} \cdot \frac{(n+1)(n+2)}{4}$$is an identity so we are done (the $n+1$ comes from the $n+1$ moves it takes to get from all $H$ to the finish).
This post has been edited 1 time. Last edited by eg4334, Dec 23, 2024, 10:04 PM
abeot
129 posts
#113
I claim the answer is $\frac{n(n+1)}{4}$, which would imply the process terminates. We prove the statement by induction on $n$, with base case $n = 1$ trivial. Hence, assume $n \ge 2$, and let the answer for $n = m$ be $a_m$, so $a_1 = \frac 12$.

Claim: If the last coin is $T$, then the process takes an average of $a_{n-1}$ steps.
Proof. The last coin never changes; if it could, take the first instance it becomes heads. Then the previous instance must have had $n$ heads, contradicting minimality of time.
Thus, by the inductive hypothesis on the first $n-1$ coins, it takes $a_{n-1}$ steps, on average. $\square$

Claim: If the last coin is $H$, then the process takes $a_{n-1} + n$ steps.
Proof. Let the first $n-1$ coins be $s_1$, $s_2$, \dots, $s_{n-1}$. Define coins $t_1$, $t_2$, \dots, $t_{n-1}$ such that \[ s_i \text{ is heads } \iff t_{n-i} \text{ is tails, for all } i = 1, 2, \dots, n-1 \](in the natural binary string interpretation, $t$ is the reverse of the inverse of $s$).
The key fact is that this condition is preserved. Suppose at some moment $k < n-1$ of the $s_i$ are heads, so the whole string has $k+1$ heads (counting the final $H$) and the process flips $s_{k+1}$. Meanwhile $n-1-k$ of the $t_i$ are heads, so the standard process on the $t_i$ flips $t_{n-1-k}$. These two positions correspond under the pairing, since $(k+1) + (n-1-k) = n$. Note
  • If $s_{k+1}$ was heads then $t_{n-1-k}$ was tails. After the flip, then $s_{k+1}$ is tails and $t_{n-1-k}$ is heads.
  • If $s_{k+1}$ was tails then $t_{n-1-k}$ was heads. After the flip, then $s_{k+1}$ is heads and $t_{n-1-k}$ is tails.
Either way, the condition still holds. Yet we know the $t_i$ must become all tails with an average of $a_{n-1}$ steps, by the inductive hypothesis. Thus, the $s_i$ become all heads in an average of $a_{n-1}$ steps.
So after an average of $a_{n-1}$ steps, all $n$ coins are heads. Clearly $n$ more steps then bring us to all tails. $\square$

We finish by computing \[ a_n = \frac{2^{n-1}}{2^n}\cdot a_{n-1} + \frac{2^{n-1}}{2^n}\cdot (a_{n-1} + n) = a_{n-1} + \frac n2 = \frac{n(n+1)}{4} \]as desired. $\blacksquare$
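Both claims can be sanity-checked numerically for small $n$; the following is only an illustrative sketch (helper names are mine):

```python
from fractions import Fraction
from itertools import product

def steps(config):
    """Flip the k-th coin while k heads (1s) remain; return the number of moves."""
    c = list(config)
    moves = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        moves += 1
    return moves

for n in range(2, 8):
    a_prev = Fraction((n - 1) * n, 4)     # inductive value a_{n-1}
    prefixes = list(product((0, 1), repeat=n - 1))
    end_T = Fraction(sum(steps(p + (0,)) for p in prefixes), 2 ** (n - 1))
    end_H = Fraction(sum(steps(p + (1,)) for p in prefixes), 2 ** (n - 1))
    assert end_T == a_prev        # last coin T: average a_{n-1} steps
    assert end_H == a_prev + n    # last coin H: average a_{n-1} + n steps
```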
mathwiz_1207
105 posts
#114
Beautiful problem! We will first prove that the sequence always terminates. Define a reduction as a sequence of moves which changes exactly one coin with $H$ facing up into a $T$ facing up, leaving all other coins unchanged. We claim that whenever there are $k > 0$ coins showing $H$, we can obtain a reduction.

Define a coin at index $i$ to be the $i^{th}$ coin from the left. Assume that the coins with $H$ facing upwards have indices of $a_1, a_2, \dots a_k$, with
\[a_1 < a_2 < \dots < a_k\]Obviously, $a_k \geq k$. Now, let $a_j$ be the index of the first coin with $H$ facing upwards and $a_j \geq k$ (there exists such a $j$ since $a_k \geq k$). If $a_j = k$, we immediately obtain a reduction. Otherwise, consider the following sequence of moves (the subscript represents the index of the coin):
\[\underset{k}TT\dots T \underset{a_j}{H} \rightarrow \underset{k}HT\dots T \underset{a_j}{H}\]Now there are $k + 1$ coins with $H$ facing up, so we move from
\[\underset{k}HT\dots T \underset{a_j}{H} \rightarrow \underset{k}{H}H\dots T \underset{a_j}H\]We see that we keep doing the moves until we reach
\[\underset{k}HH\dots H \underset{a_j}{H}\]this takes ($a_j - k$) steps. There are now $k + a_j - k = a_j$ coins with $H$ facing up, so doing a move now gets us to
\[\underset{k}HH\dots H \underset{a_j}{H} \rightarrow \underset{k}HH\dots H\underset{a_j}T\]Once again, we can keep doing moves until we eventually reach
\[\underset{k}TT\dots T \underset{a_j}{T}\]which takes $(a_j - k + 1)$ moves. Therefore, this sequence of moves is indeed a reduction, since the coin at index $a_j$ is now a $T$, and everything else is unchanged. Therefore, we can keep performing reductions until we obtain no heads.

Now, notice that each reduction takes exactly $2(a_j - k) + 1$ moves when there are $k$ coins showing $H$. Therefore, going from $k$ heads down to $0$ heads takes
\[2\sum_{i=1}^k a_i - 2\sum_{i=1}^k i + k = 2\sum_{i=1}^k a_i - k^2\]steps in total, summing over the reductions as the head count runs down from $k$ to $1$. Now, if we sum this over all arrangements with exactly $k$ heads, we get
\[-k^2 \cdot {n \choose k} + 2\sum_{i=1}^ni\cdot{n - 1 \choose k - 1} = n(n+1){n -1 \choose k-1} - nk{n-1 \choose k-1}\]Now, summing this over all possible values of $k$,
\begin{align*}
S &= \sum_{k=0}^n\left[n(n+1){n -1 \choose k-1} - nk{n-1 \choose k-1}\right] \\
&= n(n+1)\sum_{k=0}^n{n-1 \choose k -1} - n \sum_{k=0}^n k {n-1 \choose k - 1} \\
&= n(n+1)\cdot 2^{n-1} - n \cdot (n+1)2^{n-2} \\
&= n(n+1) \cdot 2^{n-2}
\end{align*}Therefore, the average number of steps over all $2^n$ sequences is
\[\frac{n(n+1) \cdot 2^{n-2}}{2^n} = \boxed{\frac{n(n+1)}{4}}\]and we are done.
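The closed form $L(c) = 2\sum_{i} a_i - k^2$ is easy to confirm by direct simulation for small $n$ (an illustrative sketch; `steps` is my own name for the process):

```python
from itertools import product

def steps(config):
    """Flip the k-th coin while k heads (1s) remain; return the move count."""
    c = list(config)
    moves = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        moves += 1
    return moves

# L(c) = 2 * (sum of 1-indexed head positions) - (number of heads)^2
for n in range(1, 9):
    for c in product((0, 1), repeat=n):
        heads = [i + 1 for i, x in enumerate(c) if x]
        assert steps(c) == 2 * sum(heads) - len(heads) ** 2
```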
This post has been edited 1 time. Last edited by mathwiz_1207, Jan 15, 2025, 2:44 AM
Reason: formatting
zuat.e
77 posts
#115
Solution: $\frac{n(n+1)}{4}$

For the first part, suppose we keep making moves until we are forced to turn an $H$ into a $T$ (if this never happens, every move turns a $T$ into an $H$, so we eventually reach $HHHH \dots H$, which clearly terminates).

After that first move, we create a chain of $H$'s until we reach a coin already showing $H$, which gets turned into a $T$ (say the chain has $k$ $H$'s); we then create a chain of at least $k+1$ $T$'s until we reach a $T$, which gets turned into an $H$, after which we create a chain of at least $k+2$ $H$'s before we have to turn again, and so on.
The length of these chains is strictly increasing (unless the process terminates), but a chain has length $\leq n$, so the process must terminate.

For the second part, we prove by induction that the average is $av(n)=\frac{n(n+1)}{4}$. We first need a couple of claims.

Claim 1: There are $n2^{n-1}$ $T$'s over all $2^n$ configurations.

Proof:
There are $n2^n$ letters in total, and by symmetry half are $T$'s and half are $H$'s, hence the total number of $T$'s is $T_t=n2^{n-1}$.

Claim 2: Suppose we append a final $H$ to a configuration $c$ which takes $k$ steps; the new configuration then takes $k+2t+1$ steps, where $t$ is the number of $T$'s in $c$. Moreover, if we instead append a final $T$, the number of steps does not change.
Proof:
We actually prove, by strong induction on $n$, a stronger result: if we append $l$ $H$'s, the step count becomes $k+2tl+l$. Taking $l=1$ gives the claim; the base case $n=2$ is trivial. Suppose the result holds for all lengths up to $n$ and consider a configuration of $n+1$ coins. If it ends in an $H$, it takes $2t+1$ more steps than the configuration with that final $H$ removed (which therefore takes $k-2t-1$ steps), so appending $l+1$ $H$'s to that shorter configuration yields $(k-2t-1)+2t(l+1)+(l+1)=k+2tl+l$, as desired.
If it ends in a $T$, remove that final $T$ and suppose the resulting configuration takes $k$ steps (removing a final $T$ does not change the count). Appending $l$ $H$'s gives $k+l+2l(t-1)$ steps by induction, and re-inserting the removed $T$ then increases the count by $2$ for each appended $H$, for a total of $k+2lt+l$.
Each appended $H$ costs $2$ extra steps for the following reason. In the configuration without that $T$, whenever the process reaches the appended $H$'s it goes down $a_1$ steps, then up $a_1+1$ steps, then down $a_2$ and up $a_2+1$, \dots, so over all $l$ appended $H$'s this part contributes $2\sum_{i=1}^la_i+l$ steps.
In the configuration with the $T$, since the $T$ does not affect the number of $H$'s, everything proceeds identically until the process reaches that $T$, which turns into an $H$; the process then goes down $a_1+1$ steps and up $(a_1+1)+1=a_1+2$, again down $a_2+1$ and up $a_2+2$, \dots, so this part contributes $2\sum_{i=1}^la_i+3l$ steps, exactly $2l$ more, as desired.

Now we finish by induction. Suppose $av(n)=\frac{n(n+1)}{4}$; appending a final $H$ and a final $T$ to each of the $2^n$ configurations produces all configurations of length $n+1$, so
$av(n+1)=av(n)+\frac{2T_t+2^n}{2^{n+1}}=\frac{n(n+1)}{4}+\frac{n+1}{2}=\frac{(n+1)(n+2)}{4}$, as desired.
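Claim 2 can be confirmed by brute force for small lengths; this is just an illustrative sketch (names are mine):

```python
from itertools import product

def steps(config):
    """Flip the k-th coin while k heads (1s) remain; return the move count."""
    c = list(config)
    moves = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        moves += 1
    return moves

for n in range(1, 8):
    for c in product((0, 1), repeat=n):
        t = n - sum(c)                                    # number of T's in c
        assert steps(c + (1,)) == steps(c) + 2 * t + 1    # append H: +2t+1 steps
        assert steps(c + (0,)) == steps(c)                # append T: unchanged
```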
de-Kirschbaum
204 posts
#116
Let $H=1, T=0$. We will prove that there is an operation-preserving bijection between the sequences whose last bit is $1$ and the sequences whose first bit is $0$.

We will construct an operation-preserving bijection $$f:\textrm{binary sequences of length n ending in 1} \to \textrm{binary sequences of length n starting with 0}$$
where we invert each bit of the sequence and then reverse the entire sequence, so for example $f(0001)=0111$ via $0001 \rightarrow 1110 \rightarrow 0111$. This $f$ is clearly invertible, so it is a bijection. On the image we perform the same process, except that we add $1$ to the number of $1$s before choosing which bit to flip, since we are now guaranteed a $0$ in the first digit; for example, $0111 \rightarrow 0110$. This modified process matches the old one, because all the $1$s used to be $0$s: pre-transformation we would flip the $s$th bit, where $s \leq n-1$ is the number of $1$s on the board, and post-transformation we flip the $(n-s+1)$st bit, which is the $s$th bit counting from right to left; reversing the sequence once again shows these are the same bit. The first bit (which is the last bit pre-transformation) is never touched.

Noting these two properties we can prove that this bijection is indeed operation preserving: ie $A \rightarrow B \iff f(A) \rightarrow f(B)$. If $A \rightarrow B$, suppose the $s$th bit gets flipped, then we know that from $f(A) \rightarrow f(B)$ the $n-s+1$st bit gets flipped. Undoing the transformation we see that $f(B)$ gives $B$. Similarly, we can see that if $f(A) \rightarrow f(B)$ then $A \rightarrow B$.

Note that $f(1111\ldots 1)=000\ldots0$ and the new process is just the old process on the $n-1$ bits indexed from 2 to $n$.

Now we will prove termination. Let $G_i$ denote the directed state graph of this process on $i$ coins. Drawing the base cases $i=1,2$, we see that $G_1$ and $G_2$ are cycleless.

Assume that $G_k$ is cycleless. Then $G_{k+1}$ has two components. The state graph for all sequences ending in $0$ is just $G_{k}$ with a $0$ tagged onto the end of each vertex. The state graph of the transformed sequences ending in $1$ is just $G_{k}$ with a $0$ tagged onto the beginning of each vertex. The two are glued together via $1111\ldots 11 \rightarrow 1111 \ldots 10$. Thus, $G_{k+1}$ is cycleless.

This means that if we let $E[X_n]$ be the expected number of steps taken for $n$ coins, then $$E[X_n]=\frac{2E[X_{n-1}]+n}{2}=E[X_{n-1}]+\frac{n}{2}$$(the extra $n$ comes from the sequences ending in $1$, which after reaching all heads need $n$ more steps to finish), with $E[X_1]=\frac{1}{2}$, so substituting we see that $E[X_n]=\frac{n(n+1)}{4}$.
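The operation-preserving property of $f$ can be checked exhaustively for small $n$; a sketch with my own helper names (`mod_step` implements the modified rule of flipping bit $s+1$, where $s$ is the number of $1$s):

```python
from itertools import product

def f(seq):
    """Invert every bit, then reverse the sequence."""
    return tuple(1 - b for b in reversed(seq))

def step(seq):
    """Original process: flip the s-th bit, where s is the number of 1s."""
    s = sum(seq)
    out = list(seq)
    out[s - 1] ^= 1
    return tuple(out)

def mod_step(seq):
    """Modified process: flip bit s+1, where s is the number of 1s."""
    s = sum(seq) + 1
    out = list(seq)
    out[s - 1] ^= 1
    return tuple(out)

# one original move on A corresponds to one modified move on f(A)
for n in range(2, 9):
    for prefix in product((0, 1), repeat=n - 1):
        A = prefix + (1,)            # any sequence ending in 1
        assert f(step(A)) == mod_step(f(A))
```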
Ihatecombin
69 posts
#117
It suffices to prove part (b) by itself, since solving part (b) clearly implies part (a). We claim the average value of \(L(c)\) equals \(\frac{n(n+1)}{4}\). We prove this by strong induction; the base cases \(n=1\) and \(n=2\) are easy. Define \(S(n)\) as the sum of \(L(c)\) over all \(2^n\) configurations \(c\) of length \(n\).

We shall split the \(2^n\) configurations into \(3\) cases: the case where the last letter is \(T\), the case where the first and last letters are both \(H\), and finally the case where the first letter is \(T\) and the last letter is \(H\).

Claim 1: The sum of the \(L(C)\) for the case where \(C\) has \(T\) as a final letter is \(S(n-1)\)
Proof:
Obvious: since the number of \(H\)'s is at most \(n-1\), the final coin is never flipped. $\blacksquare$

Claim 2: The sum of the \(L(C)\) for the case where \(C\) has \(H\) as a final letter and \(H\) as a first letter is \(S(n-1) - S(n-2) + 2^{n-2}\)
Proof:
Notice that the first coin is flipped only at the very end, when all the other coins show \(T\) (the first coin is flipped exactly when a single \(H\) remains). So we may momentarily ignore the leading \(H\) and compute the sum of moves required over the \(2^{n-2}\) configurations of \(n-1\) coins whose final coin is \(H\).

We know that the sum over all \(2^{n-1}\) configurations of \(n-1\) coins with arbitrary final coin is \(S(n-1)\), and the sum over the \(2^{n-2}\) configurations of \(n-1\) coins with final coin \(T\) is \(S(n-2)\) (since we can just ignore that final coin).
Thus the sum of moves required over the \(2^{n-2}\) configurations of \(n-1\) coins with final coin \(H\) is \(S(n-1) - S(n-2)\). At the end of this phase each configuration has the form \(HTTT \cdots T\), which needs exactly one more move to show all \(T\)'s; since there are \(2^{n-2}\) configurations with \(H\) as both first and last letter, the claim follows. $\blacksquare$

Claim 3: The sum of the \(L(C)\) for the case where \(C\) has \(H\) as a final letter and \(T\) as a first letter is \(S(n-2) + 2^{n-2}(2n-1)\)
Proof:
Notice that the final letter can only be turned into a \(T\) once the coins reach the form \(HHHH \cdots HHH\), and the first coin can only be flipped from the form \(TTTTT \cdots TTH\). This motivates us to count the number of moves required for each initial configuration in this class to become \(TTTTT \cdots TTH\). Since neither the first coin nor the final \(H\) is flipped before that point, the middle \(n-2\) coins run their own process, so this sum is \(S(n-2)\).

Once a configuration reaches this point, exactly \(2n-1\) further moves are needed for it to become \(TTT \cdots TTT\): the coins must all become \(H\) first (\(n-1\) moves) and then all become \(T\) (\(n\) moves). Note that there are \(2^{n-2}\) configurations in this class.
Thus the total sum for series of this class is \(S(n-2) + 2^{n-2}(2n-1)\). $\blacksquare$

Thus by summing the results of our \(3\) claims we obtain a recursion relation for our function \(S(n)\), which is
\[S(n) = 2S(n-1) + 2^{n-2} \cdot 2n \ \text{for all } n \geq 3, \quad S(1) = 1, \quad S(2) = 6\]It is easy to check that this is satisfied by \(S(n) = n(n+1) \cdot 2^{n-2}\).
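Both the recursion and the closed form agree with direct simulation for small $n$; an illustrative sketch (names are mine):

```python
from itertools import product

def steps(config):
    """Flip the k-th coin while k heads (1s) remain; return the move count."""
    c = list(config)
    moves = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        moves += 1
    return moves

def S(n):
    """Sum of L(c) over all 2^n configurations of length n."""
    return sum(steps(c) for c in product((0, 1), repeat=n))

assert S(1) == 1 and S(2) == 6
for n in range(1, 9):
    assert S(n) == n * (n + 1) * 2 ** n // 4          # closed form n(n+1)·2^(n-2)
for n in range(3, 9):
    assert S(n) == 2 * S(n - 1) + 2 ** (n - 2) * 2 * n  # the recursion above
```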