Commit ab70a63

committed: fix
1 parent 41d1df2 commit ab70a63

30 files changed: +241 −180 lines changed

src/game_theory/sprague-grundy-nim.md

Lines changed: 22 additions & 22 deletions

@@ -73,27 +73,27 @@ Then the proof splits into two parts:
 if for the current position the xor-sum $s = 0$, we have to prove that this state is losing, i.e. all reachable states have xor-sum $t \neq 0$.
 If $s \neq 0$, we have to prove that there is a move leading to a state with $t = 0$.
 
-* Let $s = 0$ and let's consider any move.
-This move reduces the size of a pile $x$ to a size $y$.
-Using elementary properties of $\oplus$, we have
-
-$$t = s \oplus x \oplus y = 0 \oplus x \oplus y = x \oplus y$$
-
-Since $y < x$, $y \oplus x$ can't be zero, so $t \neq 0$.
-That means any reachable state is a winning one (by the assumption of induction), so we are in a losing position.
-
-* Let $s \neq 0$.
-Consider the binary representation of the number $s$.
-Let $d$ be the number of its leading (biggest value) non-zero bit.
-Our move will be on a pile whose size's bit number $d$ is set (it must exist, otherwise the bit wouldn't be set in $s$).
-We will reduce its size $x$ to $y = x \oplus s$.
-All bits at positions greater than $d$ in $x$ and $y$ match and bit $d$ is set in $x$ but not set in $y$.
-Therefore, $y < x$, which is all we need for a move to be legal.
-Now we have:
-
-$$ t = s \oplus x \oplus y = s \oplus x \oplus (s \oplus x) = 0$$
-
-This means we found a reachable losing state (by the assumption of induction) and the current state is winning.
+* Let $s = 0$ and let's consider any move.
+This move reduces the size of a pile $x$ to a size $y$.
+Using elementary properties of $\oplus$, we have
+
+\[ t = s \oplus x \oplus y = 0 \oplus x \oplus y = x \oplus y \]
+
+Since $y < x$, $y \oplus x$ can't be zero, so $t \neq 0$.
+That means any reachable state is a winning one (by the assumption of induction), so we are in a losing position.
+
+* Let $s \neq 0$.
+Consider the binary representation of the number $s$.
+Let $d$ be the number of its leading (biggest value) non-zero bit.
+Our move will be on a pile whose size's bit number $d$ is set (it must exist, otherwise the bit wouldn't be set in $s$).
+We will reduce its size $x$ to $y = x \oplus s$.
+All bits at positions greater than $d$ in $x$ and $y$ match and bit $d$ is set in $x$ but not set in $y$.
+Therefore, $y < x$, which is all we need for a move to be legal.
+Now we have:
+
+\[ t = s \oplus x \oplus y = s \oplus x \oplus (s \oplus x) = 0 \]
+
+This means we found a reachable losing state (by the assumption of induction) and the current state is winning.
 
 **Corollary.**
 Any state of Nim can be replaced by an equivalent state as long as the xor-sum doesn't change.

@@ -124,7 +124,7 @@ The number $x$ is called the Grundy value or nim-value of state $v$.
 
 Moreover, this number can be found in the following recursive way:
 
-$$ x = \text{mex}\ \\{ x_1, \ldots, x_k \\}, $$
+$$ x = \text{mex}\ \{ x_1, \ldots, x_k \}, $$
 
 where $x_i$ is the Grundy value for state $v_i$ and the function $\text{mex}$ (*minimum excludant*) is the smallest non-negative integer not found in the given set.
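As an aside to this file's hunks, both the $\text{mex}$ recursion and the explicit winning move $y = x \oplus s$ constructed in the proof fit in a few lines of C++. This is a minimal illustrative sketch, not code from the repository or the commit:

```cpp
#include <cassert>
#include <set>
#include <utility>
#include <vector>

// Smallest non-negative integer not present among the values (minimum excludant).
int mex(const std::vector<int>& values) {
    std::set<int> s(values.begin(), values.end());
    int m = 0;
    while (s.count(m)) m++;
    return m;
}

// For a Nim position with nonzero xor-sum s, return a winning move as
// (pile index, new pile size), exactly as the proof constructs it:
// pick a pile whose highest bit of s is set and reduce it to x ^ s.
std::pair<int, int> winning_move(const std::vector<int>& piles) {
    int s = 0;
    for (int x : piles) s ^= x;
    assert(s != 0);  // only positions with s != 0 have a winning move
    for (int i = 0; i < (int)piles.size(); i++)
        if ((piles[i] ^ s) < piles[i])  // bit d of s is set in piles[i]
            return {i, piles[i] ^ s};
    return {-1, -1};  // unreachable when s != 0
}
```

For piles $\{3, 4, 5\}$ the xor-sum is $2$, and reducing the first pile to $3 \oplus 2 = 1$ makes the new xor-sum zero.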

src/geometry/vertical_decomposition.md

Lines changed: 1 addition & 1 deletion

@@ -38,7 +38,7 @@ For simplicity we will show how to do this for an upper segment, the algorithm f
 
 Here is a graphic representation of the three cases.
 
-<center><img src="triangle_union.png" alt="Visual" width="90%"></center>
+<center>![Visual](triangle_union.png)</center>
 
 Finally we should remark on processing all the additions of $1$ or $-1$ on all stripes in $[x_1, x_2]$. For each addition of $w$ on $[x_1, x_2]$ we can create events $(x_1, w),\ (x_2, -w)$
 and process all these events with a sweep line.
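The event-based sweep mentioned in the context lines above can be sketched as follows: each range addition becomes two point events, and a running sum over the sorted events yields the value active between consecutive coordinates. This is a hypothetical minimal sketch, not code from the article:

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Sweep over events (x, w): after sorting by x, the running sum of the
// weights is the total addition active on [x, next_x). Events sharing the
// same x would need to be grouped in real use; this sketch keeps them separate.
std::vector<std::pair<double, long long>> sweep(
        std::vector<std::pair<double, long long>> events) {
    std::sort(events.begin(), events.end());
    long long cur = 0;
    std::vector<std::pair<double, long long>> active;  // (x, value on [x, next_x))
    for (auto [x, w] : events) {
        cur += w;
        active.push_back({x, cur});
    }
    return active;
}
```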

src/graph/bellman_ford.md

Lines changed: 1 addition & 1 deletion

@@ -224,7 +224,7 @@ There are some care to be taken in the implementation, such as the fact that the
 To avoid this, it is possible to create a counter that stores how many times a vertex has been relaxed and stop the algorithm as soon as some vertex got relaxed for the $n$-th time.
 Note, also there is no reason to put a vertex in the queue if it is already in.
 
-```cpp spfa
+```{.cpp file=spfa}
 const int INF = 1000000000;
 vector<vector<pair<int, int>>> adj;
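The diff above only shows the opening of the SPFA snippet; for context, the whole idea (Bellman-Ford with a queue, an in-queue flag, and the relaxation counter described in the context lines) can be sketched like this. This is an editor's sketch consistent with the visible declarations, not the file's actual implementation:

```cpp
#include <cassert>
#include <queue>
#include <utility>
#include <vector>

const int INF = 1000000000;

// SPFA: process vertices from a queue, relax outgoing edges, and re-queue a
// vertex only if it is not already queued. A vertex relaxed n times proves a
// negative cycle, so we return false in that case.
bool spfa(int s, const std::vector<std::vector<std::pair<int, int>>>& adj,
          std::vector<int>& d) {
    int n = adj.size();
    d.assign(n, INF);
    std::vector<bool> in_queue(n, false);
    std::vector<int> cnt(n, 0);  // how many times each vertex was queued
    std::queue<int> q;
    d[s] = 0;
    q.push(s);
    in_queue[s] = true;
    while (!q.empty()) {
        int v = q.front(); q.pop();
        in_queue[v] = false;
        for (auto [to, w] : adj[v]) {
            if (d[v] + w < d[to]) {
                d[to] = d[v] + w;
                if (!in_queue[to]) {
                    if (++cnt[to] == n) return false;  // negative cycle
                    q.push(to);
                    in_queue[to] = true;
                }
            }
        }
    }
    return true;
}
```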

src/graph/bridge-searching-online.md

Lines changed: 55 additions & 53 deletions

@@ -70,59 +70,61 @@ Each 2-edge-connected component will store the index `par[]` of its ancestor in
 
 We will now consistently disassemble every operation that we need to learn to implement:
 
-* Check whether the two vertices lie in the same connected / 2-edge-connected component.
-It is done with the usual DSU algorithm: we just find and compare the representatives of the DSUs.
-
-* Joining two trees for some edge $(a, b)$.
-Since it could turn out that neither the vertex $a$ nor the vertex $b$ are the roots of their trees, the only way to connect these two trees is to re-root one of them.
-For example you can re-root the tree of vertex $a$, and then attach it to another tree by setting the ancestor of $a$ to $b$.
-
-However, the question of the efficiency of the re-rooting operation arises:
-in order to re-root the tree with the root $r$ to the vertex $v$, it is necessary to visit all vertices on the path between $v$ and $r$ and redirect the pointers `par[]` in the opposite direction, and also change the references to the ancestors in the DSU that is responsible for the connected components.
-
-Thus, the cost of re-rooting is $O(h)$, where $h$ is the height of the tree.
-You can make an even worse estimate by saying that the cost is $O(\text{size})$ where $\text{size}$ is the number of vertices in the tree.
-The final complexity will not differ.
-
-We now apply a standard technique: we re-root the tree that contains fewer vertices.
-Then it is intuitively clear that the worst case is when two trees of approximately equal sizes are combined, but then the result is a tree of twice the size.
-This does not allow this situation to happen many times.
-
-In general the total cost can be written in the form of a recurrence:
-$$T(n) = \max_{k = 1 \ldots n-1} \left\\{ T(k) + T(n - k) + O(\min(k, n - k))\right\\}$$
-$T(n)$ is the number of operations necessary to obtain a tree with $n$ vertices by means of re-rooting and unifying trees.
-A tree of size $n$ can be created by combining two smaller trees of size $k$ and $n - k$.
-This recurrence has the solution $T(n) = O(n \log n)$.
-
-Thus, the total time spent on all re-rooting operations will be $O(n \log n)$ if we always re-root the smaller of the two trees.
-
-We will have to maintain the size of each connected component, but the data structure DSU makes this possible without difficulty.
-
-* Searching for the cycle formed by adding a new edge $(a, b)$.
-Since $a$ and $b$ are already connected in the tree we need to find the [Lowest Common Ancestor](lca.md) of the vertices $a$ and $b$.
-The cycle will consist of the path from $a$ to the LCA, the path from the LCA to $b$, and the edge $(a, b)$.
-
-After finding the cycle we compress all vertices of the detected cycle into one vertex.
-This means that we already have a complexity proportional to the cycle length, which means that we also can use any LCA algorithm proportional to the length, and don't have to use any fast one.
-
-Since the only information available about the structure of the tree is the ancestor array `par[]`, the only reasonable LCA algorithm is the following:
-mark the vertices $a$ and $b$ as visited, then we go to their ancestors `par[a]` and `par[b]` and mark them, then advance to their ancestors and so on, until we reach an already marked vertex.
-This vertex is the LCA that we are looking for, and we can find the vertices on the cycle by traversing the path from $a$ and $b$ to the LCA again.
-
-It is obvious that the complexity of this algorithm is proportional to the length of the desired cycle.
-
-* Compression of the cycle by adding a new edge $(a, b)$ in a tree.
-
-We need to create a new 2-edge-connected component, which will consist of all vertices of the detected cycle (the detected cycle itself could already consist of some 2-edge-connected components, but this does not change anything).
-In addition it is necessary to compress them in such a way that the structure of the tree is not disturbed, and all pointers `par[]` and both DSUs are still correct.
-
-The easiest way to achieve this is to compress all the vertices of the cycle to their LCA.
-In fact the LCA is the highest of the vertices, i.e. its ancestor pointer `par[]` remains unchanged.
-For all the other vertices of the loop the ancestors do not need to be updated, since these vertices simply cease to exist.
-But in the DSU of the 2-edge-connected components all these vertices will simply point to the LCA.
-
-We will implement the DSU of the 2-edge-connected components without the union by rank optimization, therefore we will get the complexity $O(\log n)$ on average per query.
-To achieve the complexity $O(1)$ on average per query, we need to combine the vertices of the cycle according to union by rank, and then assign `par[]` accordingly.
+* Check whether the two vertices lie in the same connected / 2-edge-connected component.
+It is done with the usual DSU algorithm: we just find and compare the representatives of the DSUs.
+
+* Joining two trees for some edge $(a, b)$.
+Since it could turn out that neither the vertex $a$ nor the vertex $b$ are the roots of their trees, the only way to connect these two trees is to re-root one of them.
+For example you can re-root the tree of vertex $a$, and then attach it to another tree by setting the ancestor of $a$ to $b$.
+
+However, the question of the efficiency of the re-rooting operation arises:
+in order to re-root the tree with the root $r$ to the vertex $v$, it is necessary to visit all vertices on the path between $v$ and $r$ and redirect the pointers `par[]` in the opposite direction, and also change the references to the ancestors in the DSU that is responsible for the connected components.
+
+Thus, the cost of re-rooting is $O(h)$, where $h$ is the height of the tree.
+You can make an even worse estimate by saying that the cost is $O(\text{size})$ where $\text{size}$ is the number of vertices in the tree.
+The final complexity will not differ.
+
+We now apply a standard technique: we re-root the tree that contains fewer vertices.
+Then it is intuitively clear that the worst case is when two trees of approximately equal sizes are combined, but then the result is a tree of twice the size.
+This does not allow this situation to happen many times.
+
+In general the total cost can be written in the form of a recurrence:
+
+\[ T(n) = \max_{k = 1 \ldots n-1} \left\{ T(k) + T(n - k) + O(\min(k, n - k))\right\} \]
+
+$T(n)$ is the number of operations necessary to obtain a tree with $n$ vertices by means of re-rooting and unifying trees.
+A tree of size $n$ can be created by combining two smaller trees of size $k$ and $n - k$.
+This recurrence has the solution $T(n) = O(n \log n)$.
+
+Thus, the total time spent on all re-rooting operations will be $O(n \log n)$ if we always re-root the smaller of the two trees.
+
+We will have to maintain the size of each connected component, but the data structure DSU makes this possible without difficulty.
+
+* Searching for the cycle formed by adding a new edge $(a, b)$.
+Since $a$ and $b$ are already connected in the tree we need to find the [Lowest Common Ancestor](lca.md) of the vertices $a$ and $b$.
+The cycle will consist of the path from $a$ to the LCA, the path from the LCA to $b$, and the edge $(a, b)$.
+
+After finding the cycle we compress all vertices of the detected cycle into one vertex.
+This means that we already have a complexity proportional to the cycle length, which means that we also can use any LCA algorithm proportional to the length, and don't have to use any fast one.
+
+Since the only information available about the structure of the tree is the ancestor array `par[]`, the only reasonable LCA algorithm is the following:
+mark the vertices $a$ and $b$ as visited, then we go to their ancestors `par[a]` and `par[b]` and mark them, then advance to their ancestors and so on, until we reach an already marked vertex.
+This vertex is the LCA that we are looking for, and we can find the vertices on the cycle by traversing the path from $a$ and $b$ to the LCA again.
+
+It is obvious that the complexity of this algorithm is proportional to the length of the desired cycle.
+
+* Compression of the cycle by adding a new edge $(a, b)$ in a tree.
+
+We need to create a new 2-edge-connected component, which will consist of all vertices of the detected cycle (the detected cycle itself could already consist of some 2-edge-connected components, but this does not change anything).
+In addition it is necessary to compress them in such a way that the structure of the tree is not disturbed, and all pointers `par[]` and both DSUs are still correct.
+
+The easiest way to achieve this is to compress all the vertices of the cycle to their LCA.
+In fact the LCA is the highest of the vertices, i.e. its ancestor pointer `par[]` remains unchanged.
+For all the other vertices of the loop the ancestors do not need to be updated, since these vertices simply cease to exist.
+But in the DSU of the 2-edge-connected components all these vertices will simply point to the LCA.
+
+We will implement the DSU of the 2-edge-connected components without the union by rank optimization, therefore we will get the complexity $O(\log n)$ on average per query.
+To achieve the complexity $O(1)$ on average per query, we need to combine the vertices of the cycle according to union by rank, and then assign `par[]` accordingly.
 
 ## Implementation
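The "re-root the smaller tree" step in the hunk above relies on a DSU that tracks component sizes. A hypothetical minimal sketch of just that piece follows (the full online bridge algorithm layers a second DSU and the `par[]` array on top of it; none of this is the repository's implementation):

```cpp
#include <cassert>
#include <numeric>
#include <utility>
#include <vector>

// DSU over connected components with sizes, so that when an edge joins two
// trees we can decide in O(1) which of them is smaller and should be re-rooted.
struct Dsu {
    std::vector<int> parent, size;
    Dsu(int n) : parent(n), size(n, 1) {
        std::iota(parent.begin(), parent.end(), 0);
    }
    int find(int v) { return parent[v] == v ? v : parent[v] = find(parent[v]); }
    // Returns the representative of the SMALLER component (the tree that
    // should be re-rooted), or -1 if a and b are already connected.
    int unite(int a, int b) {
        a = find(a); b = find(b);
        if (a == b) return -1;
        if (size[a] > size[b]) std::swap(a, b);  // a is now the smaller one
        parent[a] = b;
        size[b] += size[a];
        return a;
    }
};
```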

src/graph/bridge-searching.md

Lines changed: 1 addition & 1 deletion

@@ -24,7 +24,7 @@ Now we have to learn to check this fact for each vertex efficiently. We'll use "
 
 So, let $tin[v]$ denote entry time for node $v$. We introduce an array $low$ which will let us check the fact for each vertex $v$. $low[v]$ is the minimum of $tin[v]$, the entry times $tin[p]$ for each node $p$ that is connected to node $v$ via a back-edge $(v, p)$ and the values of $low[to]$ for each vertex $to$ which is a direct descendant of $v$ in the DFS tree:
 
-$$low[v] = \min \begin{cases} tin[v] \\\\ tin[p]& \text{ for all }p\text{ for which }(v, p)\text{ is a back edge} \\\ low[to]& \text{ for all }to\text{ for which }(v, to)\text{ is a tree edge} \end{cases}$$
+$$low[v] = \min \begin{cases} tin[v] \\ tin[p]& \text{ for all }p\text{ for which }(v, p)\text{ is a back edge} \\ low[to]& \text{ for all }to\text{ for which }(v, to)\text{ is a tree edge} \end{cases}$$
 
 Now, there is a back edge from vertex $v$ or one of its descendants to one of its ancestors if and only if vertex $v$ has a child $to$ for which $low[to] \leq tin[v]$. If $low[to] = tin[v]$, the back edge comes directly to $v$, otherwise it comes to one of the ancestors of $v$.
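The $low$ recurrence in the hunk above translates directly into a DFS; an edge $(v, to)$ is a bridge exactly when $low[to] > tin[v]$. A minimal sketch under the assumption of an adjacency-list graph without parallel edges (global names are illustrative, not the article's code):

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

int timer_ = 0;
std::vector<std::vector<int>> adj_;
std::vector<int> tin_, low_;
std::vector<bool> visited_;
std::vector<std::pair<int, int>> bridges_;

// Computes tin[] and low[] exactly as in the formula above and records
// tree edges (v, to) with low[to] > tin[v] as bridges.
void dfs(int v, int p = -1) {
    visited_[v] = true;
    tin_[v] = low_[v] = timer_++;
    for (int to : adj_[v]) {
        if (to == p) continue;  // skip the tree edge back to the parent
        if (visited_[to]) {
            low_[v] = std::min(low_[v], tin_[to]);  // back edge
        } else {
            dfs(to, v);                             // tree edge
            low_[v] = std::min(low_[v], low_[to]);
            if (low_[to] > tin_[v]) bridges_.push_back({v, to});
        }
    }
}
```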

src/graph/cutpoints.md

Lines changed: 1 addition & 1 deletion

@@ -24,7 +24,7 @@ Now we have to learn to check this fact for each vertex efficiently. We'll use "
 
 So, let $tin[v]$ denote entry time for node $v$. We introduce an array $low[v]$ which will let us check the fact for each vertex $v$. $low[v]$ is the minimum of $tin[v]$, the entry times $tin[p]$ for each node $p$ that is connected to node $v$ via a back-edge $(v, p)$ and the values of $low[to]$ for each vertex $to$ which is a direct descendant of $v$ in the DFS tree:
 
-$$low[v] = \min \begin{cases} tin[v] \\\\ tin[p] &\text{ for all }p\text{ for which }(v, p)\text{ is a back edge} \\\ low[to]& \text{ for all }to\text{ for which }(v, to)\text{ is a tree edge} \end{cases}$$
+$$low[v] = \min \begin{cases} tin[v] \\ tin[p] &\text{ for all }p\text{ for which }(v, p)\text{ is a back edge} \\ low[to]& \text{ for all }to\text{ for which }(v, to)\text{ is a tree edge} \end{cases}$$
 
 Now, there is a back edge from vertex $v$ or one of its descendants to one of its ancestors if and only if vertex $v$ has a child $to$ for which $low[to] < tin[v]$. If $low[to] = tin[v]$, the back edge comes directly to $v$, otherwise it comes to one of the ancestors of $v$.

src/graph/depth-first-search.md

Lines changed: 28 additions & 28 deletions

@@ -26,34 +26,34 @@ For more details check out the implementation.
 
 ## Applications of Depth First Search
 
-* Find any path in the graph from source vertex $u$ to all vertices.
-
-* Find the lexicographically first path in the graph from source $u$ to all vertices.
-
-* Check if a vertex in a tree is an ancestor of some other vertex:
-
-At the beginning and end of each search call we remember the entry and exit "time" of each vertex.
-Now you can find the answer for any pair of vertices $(i, j)$ in $O(1)$:
-vertex $i$ is an ancestor of vertex $j$ if and only if $\text{entry}[i] < \text{entry}[j]$ and $\text{exit}[i] > \text{exit}[j]$.
-
-* Find the lowest common ancestor (LCA) of two vertices.
-
-* Topological sorting:
-
-Run a series of depth first searches so as to visit each vertex exactly once in $O(n + m)$ time.
-The required topological ordering will be the vertices sorted in descending order of exit time.
-
-
-* Check whether a given graph is acyclic and find cycles in a graph (as mentioned above, by counting back edges in every connected component).
-
-* Find strongly connected components in a directed graph:
-
-First do a topological sorting of the graph.
-Then transpose the graph and run another series of depth first searches in the order defined by the topological sort. For each DFS call the component created by it is a strongly connected component.
-
-* Find bridges in an undirected graph:
-
-First convert the given graph into a directed graph by running a series of depth first searches and making each edge directed as we go through it, in the direction we went. Second, find the strongly connected components in this directed graph. Bridges are the edges whose ends belong to different strongly connected components.
+* Find any path in the graph from source vertex $u$ to all vertices.
+
+* Find the lexicographically first path in the graph from source $u$ to all vertices.
+
+* Check if a vertex in a tree is an ancestor of some other vertex:
+
+At the beginning and end of each search call we remember the entry and exit "time" of each vertex.
+Now you can find the answer for any pair of vertices $(i, j)$ in $O(1)$:
+vertex $i$ is an ancestor of vertex $j$ if and only if $\text{entry}[i] < \text{entry}[j]$ and $\text{exit}[i] > \text{exit}[j]$.
+
+* Find the lowest common ancestor (LCA) of two vertices.
+
+* Topological sorting:
+
+Run a series of depth first searches so as to visit each vertex exactly once in $O(n + m)$ time.
+The required topological ordering will be the vertices sorted in descending order of exit time.
+
+* Check whether a given graph is acyclic and find cycles in a graph (as mentioned above, by counting back edges in every connected component).
+
+* Find strongly connected components in a directed graph:
+
+First do a topological sorting of the graph.
+Then transpose the graph and run another series of depth first searches in the order defined by the topological sort. For each DFS call the component created by it is a strongly connected component.
+
+* Find bridges in an undirected graph:
+
+First convert the given graph into a directed graph by running a series of depth first searches and making each edge directed as we go through it, in the direction we went. Second, find the strongly connected components in this directed graph. Bridges are the edges whose ends belong to different strongly connected components.
 
 ## Classification of edges of a graph
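The entry/exit-time ancestor check listed among the applications above can be sketched as follows for a rooted tree given as child lists (a hypothetical illustration; with the strict inequalities from the text a vertex does not count as its own ancestor):

```cpp
#include <cassert>
#include <vector>

int timer_ = 0;
std::vector<int> entry_, exit_;

// Record entry time on the way down and exit time on the way up.
void dfs(int v, const std::vector<std::vector<int>>& children) {
    entry_[v] = timer_++;
    for (int to : children[v]) dfs(to, children);
    exit_[v] = timer_++;
}

// i is a (proper) ancestor of j iff i's time interval strictly contains j's.
bool is_ancestor(int i, int j) {
    return entry_[i] < entry_[j] && exit_[i] > exit_[j];
}
```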

src/graph/desopo_pape.md

Lines changed: 1 addition & 1 deletion

@@ -47,7 +47,7 @@ And of course, with each update in the array $d$ we also have to update the corr
 
 We will use an array $m$ to store in which set each vertex is currently.
 
-```cpp desopo_pape
+```{.cpp file=desopo_pape}
 struct Edge {
     int to, w;
 };
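The diff shows only the `Edge` struct; for context, the D'Esopo-Pape idea around it (the array $m$ mentioned in the context lines, with processed vertices re-entering at the front of a deque) can be sketched as follows. This is an editor's sketch of the general technique, not the file's actual code, and it assumes no negative cycles:

```cpp
#include <cassert>
#include <deque>
#include <vector>

struct Edge {
    int to, w;
};

const int INF = 1000000000;

// m[v] records which set vertex v is in: 2 = never queued yet,
// 1 = currently in the deque, 0 = already processed at least once.
// Vertices seen for the first time go to the back; previously processed
// vertices re-enter at the front.
void desopo_pape(int s, const std::vector<std::vector<Edge>>& adj,
                 std::vector<int>& d) {
    int n = adj.size();
    d.assign(n, INF);
    std::vector<int> m(n, 2);
    std::deque<int> q;
    d[s] = 0;
    q.push_back(s);
    while (!q.empty()) {
        int v = q.front(); q.pop_front();
        m[v] = 0;
        for (const Edge& e : adj[v]) {
            if (d[v] + e.w < d[e.to]) {
                d[e.to] = d[v] + e.w;
                if (m[e.to] == 2) {
                    q.push_back(e.to);   // never queued before
                } else if (m[e.to] == 0) {
                    q.push_front(e.to);  // already processed: re-queue at front
                }
                m[e.to] = 1;
            }
        }
    }
}
```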
