**src/game_theory/sprague-grundy-nim.md**

Then the proof splits into two parts:
if for the current position the xor-sum $s = 0$, we have to prove that this state is losing, i.e. all reachable states have xor-sum $t \neq 0$.
If $s \neq 0$, we have to prove that there is a move leading to a state with $t = 0$.

*   Let $s = 0$ and let's consider any move.
    This move reduces the size of a pile $x$ to a size $y$.
    Using elementary properties of $\oplus$, we have

    $$t = s \oplus x \oplus y = 0 \oplus x \oplus y = x \oplus y$$

    Since $y < x$, $x \oplus y$ can't be zero, so $t \neq 0$.
    That means any reachable state is a winning one (by the induction hypothesis), so we are in a losing position.

*   Let $s \neq 0$.
    Consider the binary representation of the number $s$.
    Let $d$ be the position of its leading (most significant) non-zero bit.
    Our move will be on a pile whose size has bit $d$ set (such a pile must exist, otherwise bit $d$ wouldn't be set in $s$).
    We will reduce its size $x$ to $y = x \oplus s$.
    All bits at positions greater than $d$ in $x$ and $y$ match, and bit $d$ is set in $x$ but not in $y$.
    Therefore, $y < x$, which is all we need for a move to be legal.
    Now we have:

    $$t = s \oplus x \oplus y = s \oplus x \oplus (s \oplus x) = 0$$

    This means we found a reachable losing state (by the induction hypothesis) and the current state is winning.
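The constructive second half of the proof translates directly into code for computing a winning move; a minimal sketch (the helper name `findMove` is ours, not from the article):

```cpp
#include <utility>
#include <vector>

// Given the pile sizes, return {pile index, new size} of a winning move,
// or {-1, 0} if the position is losing (the xor-sum is already 0).
std::pair<int, int> findMove(const std::vector<int>& piles) {
    int s = 0;
    for (int x : piles) s ^= x;        // xor-sum of the position
    if (s == 0) return {-1, 0};        // losing position: every move gives t != 0
    for (int i = 0; i < (int)piles.size(); i++) {
        int y = piles[i] ^ s;
        if (y < piles[i])              // bit d of s is set in piles[i]
            return {i, y};             // reducing pile i to y makes t = 0
    }
    return {-1, 0};                    // unreachable when s != 0
}
```

For example, for piles $(3, 4, 5)$ the xor-sum is $2$, and reducing the first pile from $3$ to $1$ gives $1 \oplus 4 \oplus 5 = 0$.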
**Corollary.**
Any state of Nim can be replaced by an equivalent state as long as the xor-sum doesn't change.

The number $x$ is called the Grundy value or nim-value of state $v$.

Moreover, this number can be found in the following recursive way:

$$ x = \text{mex}\ \{ x_1, \ldots, x_k \}, $$

where $x_i$ is the Grundy value for state $v_i$ and the function $\text{mex}$ (*minimum excludant*) is the smallest non-negative integer not found in the given set.
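A minimal sketch of this recursion, assuming the game graph is given as adjacency lists `adj` and values are memoized in `grundy` (both names are our choices):

```cpp
#include <set>
#include <vector>

std::vector<std::vector<int>> adj;  // adj[v] lists the states reachable from v
std::vector<int> grundy;            // memoized Grundy values, -1 = not computed

int mex(const std::set<int>& s) {
    int m = 0;
    while (s.count(m)) m++;         // smallest non-negative integer not in s
    return m;
}

int calcGrundy(int v) {
    if (grundy[v] != -1) return grundy[v];
    std::set<int> reachable;
    for (int to : adj[v]) reachable.insert(calcGrundy(to));
    return grundy[v] = mex(reachable);
}
```

A terminal state has no moves, so its Grundy value is $\text{mex}\ \{\} = 0$.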
**src/graph/bellman_ford.md**

Some care must be taken in the implementation, such as the fact that the algorithm may run forever if there is a negative cycle.
To avoid this, it is possible to create a counter that stores how many times a vertex has been relaxed and stop the algorithm as soon as some vertex got relaxed for the $n$-th time.
Note also that there is no reason to put a vertex in the queue if it is already in it.
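A sketch of such a queue-based (SPFA-style) variant with both safeguards; the adjacency-list format and the function name `spfa` are our assumptions, not fixed by the article:

```cpp
#include <climits>
#include <queue>
#include <utility>
#include <vector>

const int INF = INT_MAX;

// Returns false if a negative cycle is reachable from s.
// adj[v] holds pairs (to, weight); on success d holds shortest distances.
bool spfa(int s, const std::vector<std::vector<std::pair<int,int>>>& adj,
          std::vector<int>& d) {
    int n = adj.size();
    d.assign(n, INF);
    std::vector<int> cnt(n, 0);          // how often each vertex was relaxed
    std::vector<bool> inqueue(n, false); // avoid re-queueing a queued vertex
    std::queue<int> q;
    d[s] = 0;
    q.push(s); inqueue[s] = true;
    while (!q.empty()) {
        int v = q.front(); q.pop();
        inqueue[v] = false;
        for (auto [to, w] : adj[v]) {
            if (d[v] + w < d[to]) {
                d[to] = d[v] + w;
                if (!inqueue[to]) {
                    q.push(to); inqueue[to] = true;
                    if (++cnt[to] >= n)  // relaxed n times: negative cycle
                        return false;
                }
            }
        }
    }
    return true;
}
```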
**src/graph/bridge-searching-online.md**

Each 2-edge-connected component will store the index `par[]` of its ancestor in the tree.

We will now go through every operation that we need to learn to implement:

*   Check whether the two vertices lie in the same connected / 2-edge-connected component.
    It is done with the usual DSU algorithm: we just find and compare the representatives of the DSUs.

*   Joining two trees for some edge $(a, b)$.
    Since it could turn out that neither the vertex $a$ nor the vertex $b$ is the root of its tree, the only way to connect these two trees is to re-root one of them.
    For example you can re-root the tree of vertex $a$, and then attach it to the other tree by setting the ancestor of $a$ to $b$.

    However the question about the effectiveness of the re-rooting operation arises:
    in order to re-root the tree with the root $r$ to the vertex $v$, it is necessary to visit all vertices on the path between $v$ and $r$ and redirect the pointers `par[]` in the opposite direction, and also change the references to the ancestors in the DSU that is responsible for the connected components.

    Thus, the cost of re-rooting is $O(h)$, where $h$ is the height of the tree.
    You can make an even worse estimate by saying that the cost is $O(\text{size})$, where $\text{size}$ is the number of vertices in the tree.
    The final complexity will not differ.

    We now apply a standard technique: we re-root the tree that contains fewer vertices.
    Then it is intuitively clear that the worst case is when two trees of approximately equal sizes are combined, but then the result is a tree of twice the size.
    This does not allow this situation to happen many times.

    In general the total cost can be written in the form of a recurrence:
    $T(n)$ is the number of operations necessary to obtain a tree with $n$ vertices by means of re-rooting and unifying trees.
    A tree of size $n$ can be created by combining two smaller trees of size $k$ and $n - k$, paying additionally for re-rooting the smaller of the two:

    $$T(n) = \max_{k = 1 \ldots n-1} \left\{ T(k) + T(n - k) + O(\min(k, n - k)) \right\}$$

    This recurrence has the solution $T(n) = O(n \log n)$.

    Thus, the total time spent on all re-rooting operations will be $O(n \log n)$ if we always re-root the smaller of the two trees.

    We will have to maintain the size of each connected component, but the data structure DSU makes this possible without difficulty.

*   Searching for the cycle formed by adding a new edge $(a, b)$.
    Since $a$ and $b$ are already connected in the tree, we need to find the [Lowest Common Ancestor](lca.md) of the vertices $a$ and $b$.
    The cycle will consist of the path from $a$ to the LCA, the path from the LCA to $b$, and the edge $(a, b)$.

    After finding the cycle we compress all vertices of the detected cycle into one vertex.
    This means that we already have a complexity proportional to the cycle length, which means that we also can use any LCA algorithm proportional to the length, and don't have to use any fast one.

    Since all information about the structure of the tree is available in the ancestor array `par[]`, the only reasonable LCA algorithm is the following:
    mark the vertices $a$ and $b$ as visited, then go to their ancestors `par[a]` and `par[b]` and mark them, then advance to their ancestors and so on, until we reach an already marked vertex.
    This vertex is the LCA that we are looking for, and we can find the vertices on the cycle by traversing the path from $a$ and $b$ to the LCA again.

    It is obvious that the complexity of this algorithm is proportional to the length of the desired cycle.

*   Compression of the cycle by adding a new edge $(a, b)$ in a tree.

    We need to create a new 2-edge-connected component, which will consist of all vertices of the detected cycle (the detected cycle itself could also consist of some 2-edge-connected components, but this does not change anything).
    In addition it is necessary to compress them in such a way that the structure of the tree is not disturbed, and all pointers `par[]` and both DSUs are still correct.

    The easiest way to achieve this is to compress all the vertices of the cycle to their LCA.
    In fact the LCA is the highest of these vertices, i.e. its ancestor pointer `par[]` remains unchanged.
    For all the other vertices of the loop the ancestors do not need to be updated, since these vertices simply cease to exist.
    But in the DSU of the 2-edge-connected components all these vertices will simply point to the LCA.

    We will implement the DSU of the 2-edge-connected components without the Union by rank optimization, therefore we will get the complexity $O(\log n)$ on average per query.
    To achieve the complexity $O(1)$ on average per query, we need to combine the vertices of the cycle according to Union by rank, and then assign `par[]` accordingly.
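The marking-based LCA walk described above can be sketched as follows, assuming `par[]` holds each node's ancestor ($-1$ for a root) and using a per-query timestamp instead of clearing marks between queries (the names `lastVisit` and `findLca` are ours):

```cpp
#include <utility>
#include <vector>

std::vector<int> par;        // ancestor of each node in its tree, -1 for a root
std::vector<int> lastVisit;  // timestamp of the last LCA query that marked the node
int lcaIteration = 0;        // incremented per query, so no clearing is needed

// Walk upward from a and b alternately, marking nodes,
// until one walk reaches a node already marked by the other.
int findLca(int a, int b) {
    ++lcaIteration;
    while (true) {
        if (a != -1) {
            if (lastVisit[a] == lcaIteration) return a;  // found the LCA
            lastVisit[a] = lcaIteration;
            a = par[a];
        }
        std::swap(a, b);     // alternate between the two walks
    }
}
```

Alternating the two walks is what makes the running time proportional to the cycle length rather than to the tree height.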
**src/graph/bridge-searching.md**

Now we have to learn to check this fact for each vertex efficiently.

So, let $tin[v]$ denote entry time for node $v$. We introduce an array $low$ which will let us check the fact for each vertex $v$. $low[v]$ is the minimum of $tin[v]$, the entry times $tin[p]$ for each node $p$ that is connected to node $v$ via a back-edge $(v, p)$, and the values of $low[to]$ for each vertex $to$ which is a direct descendant of $v$ in the DFS tree:

$$low[v] = \min \begin{cases} tin[v] \\ tin[p]& \text{ for all }p\text{ for which }(v, p)\text{ is a back edge} \\ low[to]& \text{ for all }to\text{ for which }(v, to)\text{ is a tree edge} \end{cases}$$

Now, there is a back edge from vertex $v$ or one of its descendants to one of its ancestors if and only if vertex $v$ has a child $to$ for which $low[to] \leq tin[v]$. If $low[to] = tin[v]$, the back edge comes directly to $v$, otherwise it comes to one of the ancestors of $v$.
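Putting the $tin$/$low$ computation together, the DFS can be sketched like this (global arrays and the parent-skip convention are standard choices, not prescribed by this excerpt):

```cpp
#include <algorithm>
#include <utility>
#include <vector>

int timer_ = 0;
std::vector<std::vector<int>> adj;          // adjacency lists of the graph
std::vector<bool> visited;
std::vector<int> tin, low;                  // entry times and the low values
std::vector<std::pair<int,int>> bridges;    // collected bridge edges

void dfs(int v, int p = -1) {
    visited[v] = true;
    tin[v] = low[v] = timer_++;
    for (int to : adj[v]) {
        if (to == p) continue;                      // skip the edge to the parent
        if (visited[to]) {
            low[v] = std::min(low[v], tin[to]);     // back edge (v, to)
        } else {
            dfs(to, v);
            low[v] = std::min(low[v], low[to]);     // tree edge (v, to)
            if (low[to] > tin[v])                   // nothing in to's subtree
                bridges.push_back({v, to});         // reaches above v: a bridge
        }
    }
}
```

The bridge condition `low[to] > tin[v]` is exactly the negation of the back-edge criterion above.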
**src/graph/cutpoints.md**

Now we have to learn to check this fact for each vertex efficiently.

So, let $tin[v]$ denote entry time for node $v$. We introduce an array $low$ which will let us check the fact for each vertex $v$. $low[v]$ is the minimum of $tin[v]$, the entry times $tin[p]$ for each node $p$ that is connected to node $v$ via a back-edge $(v, p)$, and the values of $low[to]$ for each vertex $to$ which is a direct descendant of $v$ in the DFS tree:

$$low[v] = \min \begin{cases} tin[v] \\ tin[p] &\text{ for all }p\text{ for which }(v, p)\text{ is a back edge} \\ low[to]& \text{ for all }to\text{ for which }(v, to)\text{ is a tree edge} \end{cases}$$

Now, there is a back edge from vertex $v$ or one of its descendants to one of its ancestors if and only if vertex $v$ has a child $to$ for which $low[to] \leq tin[v]$. If $low[to] = tin[v]$, the back edge comes directly to $v$, otherwise it comes to one of the ancestors of $v$.
**src/graph/depth-first-search.md**

For more details check out the implementation.

## Applications of Depth First Search

* Find any path in the graph from source vertex $u$ to all vertices.

* Find the lexicographically first path in the graph from source $u$ to all vertices.

* Check if a vertex in a tree is an ancestor of some other vertex:

    At the beginning and end of each search call we remember the entry and exit "time" of each vertex.
    Now you can find the answer for any pair of vertices $(i, j)$ in $O(1)$:
    vertex $i$ is an ancestor of vertex $j$ if and only if $\text{entry}[i] < \text{entry}[j]$ and $\text{exit}[i] > \text{exit}[j]$.

* Find the lowest common ancestor (LCA) of two vertices.

* Topological sorting:

    Run a series of depth first searches so as to visit each vertex exactly once in $O(n + m)$ time.
    The required topological ordering will be the vertices sorted in descending order of exit time.

* Check whether a given graph is acyclic and find cycles in a graph (as mentioned above, by counting back edges in every connected component).

* Find strongly connected components in a directed graph:

    First do a topological sorting of the graph.
    Then transpose the graph and run another series of depth first searches in the order defined by the topological sort. For each DFS call the component created by it is a strongly connected component.

* Find bridges in an undirected graph:

    First convert the given graph into a directed graph by running a series of depth first searches and making each edge directed as we go through it, in the direction we went. Second, find the strongly connected components in this directed graph. Bridges are the edges whose ends belong to different strongly connected components.
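The entry/exit-time technique from the ancestor-check bullet above can be sketched as follows (the array names are our choices):

```cpp
#include <vector>

std::vector<std::vector<int>> adj;   // adjacency lists of the tree
std::vector<int> entry, exit_;       // entry and exit "times" of each vertex
int timer_ = 0;

void dfs(int v, int p = -1) {
    entry[v] = timer_++;             // time when we first reach v
    for (int to : adj[v])
        if (to != p) dfs(to, v);
    exit_[v] = timer_++;             // time when v's whole subtree is finished
}

// i is an ancestor of j iff j's time interval nests inside i's.
bool isAncestor(int i, int j) {
    return entry[i] < entry[j] && exit_[i] > exit_[j];
}
```

After one DFS from the root, each `isAncestor` query is answered in $O(1)$.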