
Commit c69fc85

Fix image links
1 parent f3cf188 commit c69fc85

4 files changed, +8 -8 lines changed


pgml-cms/docs/introduction/import-your-data/README.md

Lines changed: 2 additions & 2 deletions
@@ -16,7 +16,7 @@ If your intention is to use PostgresML as your primary database, your job here i
If your primary database is hosted elsewhere, for example AWS RDS or Azure Postgres, you can replicate your data to PostgresML in real time using logical replication.

-<figure class="my-3 py-3"><img src="../../../.gitbook/assets/Getting-Started_Logical-Replication-Diagram.svg" alt="Logical replication" width="80%"><figcaption></figcaption></figure>
+<figure class="my-3 py-3"><img src="../../.gitbook/assets/Getting-Started_Logical-Replication-Diagram.svg" alt="Logical replication" width="80%"><figcaption></figcaption></figure>

Having immediate access to your data is very useful for accelerating your machine learning use cases, and it removes the need to move data multiple times between microservices. Latency-sensitive applications should consider using this approach.
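For a rough idea of what this setup involves, here is a minimal sketch using psycopg2. The connection strings, table names, and publication/subscription names are all hypothetical, and the primary must have `wal_level = logical` enabled:

```python
import psycopg2

# Hypothetical connection strings -- substitute your own.
PRIMARY = "postgres://user:pass@primary.example.com:5432/app"
PGML = "postgres://user:pass@pgml.example.com:5432/app"

# On the primary: publish the tables you want streamed to PostgresML.
with psycopg2.connect(PRIMARY) as conn:
    conn.cursor().execute("CREATE PUBLICATION pgml_pub FOR TABLE users, orders;")

# On PostgresML: subscribe to that publication; changes on the primary
# now replicate in real time.
conn = psycopg2.connect(PGML)
conn.autocommit = True  # CREATE SUBSCRIPTION can't run inside a transaction
conn.cursor().execute(
    "CREATE SUBSCRIPTION pgml_sub "
    "CONNECTION 'host=primary.example.com dbname=app user=user password=pass' "
    "PUBLICATION pgml_pub;"
)
```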
@@ -25,7 +25,7 @@ accelerate your machine learning use cases and removes the need for moving data
Foreign data wrappers are a set of PostgreSQL extensions that allow making direct connections from inside the database to other databases, even ones that aren't running Postgres. For example, Postgres has foreign data wrappers for MySQL, S3, Snowflake and many others.

-<figure class="my-3 py-3"><img src="../../../.gitbook/assets/Getting-Started_FDW-Diagram.svg" alt="Foreign data wrappers" width="80%"><figcaption></figcaption></figure>
+<figure class="my-3 py-3"><img src="../../.gitbook/assets/Getting-Started_FDW-Diagram.svg" alt="Foreign data wrappers" width="80%"><figcaption></figcaption></figure>

FDWs are useful when data access is infrequent and not latency-sensitive. For many use cases, like offline batch workloads and low-traffic websites, this approach is suitable and easy to get started with.
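As a sketch of what getting started might look like, assuming the built-in `postgres_fdw` extension; the server, schema, and credential names are hypothetical:

```python
import psycopg2

# Hypothetical connection string and remote credentials.
with psycopg2.connect("postgres://user:pass@pgml.example.com:5432/app") as conn:
    cur = conn.cursor()
    cur.execute("CREATE EXTENSION IF NOT EXISTS postgres_fdw;")
    # Register the remote database as a foreign server and map credentials.
    cur.execute("""
        CREATE SERVER production FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'prod.example.com', dbname 'app');
    """)
    cur.execute("""
        CREATE USER MAPPING FOR CURRENT_USER SERVER production
        OPTIONS (user 'app_user', password 'secret');
    """)
    # Expose the remote tables in a local schema, then query them directly.
    cur.execute("CREATE SCHEMA IF NOT EXISTS prod;")
    cur.execute("IMPORT FOREIGN SCHEMA public FROM SERVER production INTO prod;")
    cur.execute("SELECT COUNT(*) FROM prod.users;")
    print(cur.fetchone()[0])
```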

pgml-cms/docs/introduction/import-your-data/foreign-data-wrappers.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ description: Connect your production database to PostgresML using Foreign Data W
Foreign data wrappers are a set of Postgres extensions that allow making direct connections to other databases from inside your PostgresML database. These can be your production Postgres database on RDS or Azure, or another database engine like MySQL, Snowflake, or even an S3 bucket.

-<figure class="my-3 py-3"><img src="../../../.gitbook/assets/Getting-Started_FDW-Diagram.svg" alt="Foreign data wrappers" width="80%"><figcaption></figcaption></figure>
+<figure class="my-3 py-3"><img src="../../.gitbook/assets/Getting-Started_FDW-Diagram.svg" alt="Foreign data wrappers" width="80%"><figcaption></figcaption></figure>

## Getting started

pgml-cms/docs/introduction/import-your-data/logical-replication/README.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ description: Stream data from your primary database to PostgresML in real time u
Logical replication allows your PostgresML database to copy data from your primary database in real time. As soon as your customers make changes to their data on your website, those changes become available in PostgresML.

-<figure class="my-3 py-3"><img src="../../../../.gitbook/assets/Getting-Started_Logical-Replication-Diagram.svg" alt="Logical replication" width="80%"><figcaption></figcaption></figure>
+<figure class="my-3 py-3"><img src="../../../.gitbook/assets/Getting-Started_Logical-Replication-Diagram.svg" alt="Logical replication" width="80%"><figcaption></figcaption></figure>

## Getting started

pgml-cms/docs/open-source/pgml/guides/chatbots/README.md

Lines changed: 4 additions & 4 deletions
@@ -30,7 +30,7 @@ Here is an example flowing from:
text -> tokens -> LLM -> probability distribution -> predicted token -> text

-<figure><img src="../../.gitbook/assets/Chatbots_Limitations-Diagram.svg" alt=""><figcaption><p>The flow of inputs through an LLM. In this case the inputs are "What is Baldur's Gate 3?" and the output token "14" maps to the word "I"</p></figcaption></figure>
+<figure><img src="../../../../.gitbook/assets/Chatbots_Limitations-Diagram.svg" alt=""><figcaption><p>The flow of inputs through an LLM. In this case the inputs are "What is Baldur's Gate 3?" and the output token "14" maps to the word "I"</p></figcaption></figure>
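A minimal sketch of that flow, assuming the Hugging Face transformers library with gpt2 as a stand-in model (any causal LM follows the same steps):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

tokens = tokenizer("What is Baldur's Gate 3?", return_tensors="pt")  # text -> tokens
logits = model(**tokens).logits                       # tokens -> LLM output
distribution = torch.softmax(logits[0, -1], dim=-1)   # -> probability distribution
next_token = torch.argmax(distribution).item()        # -> predicted token
print(tokenizer.decode([next_token]))                 # -> text
```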

{% hint style="info" %}
We have simplified the tokenization process. Words do not always map directly to tokens. For instance, the word "Baldur's" may actually map to multiple tokens. For more information on tokenization, check out [HuggingFace's summary](https://huggingface.co/docs/transformers/tokenizer\_summary).
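A quick way to see this, again assuming the transformers library (the exact split is tokenizer-dependent):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
# One word, several tokens -- the exact pieces depend on the tokenizer.
print(tokenizer.tokenize("Baldur's"))
```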
@@ -108,11 +108,11 @@ What does an `embedding` look like? `Embeddings` are just vectors (for our use c
embedding_1 = embed("King") # embed returns something like [0.11, -0.32, 0.46, ...]
```

-<figure><img src="../../.gitbook/assets/Chatbots_King-Diagram.svg" alt=""><figcaption><p>The flow of word -> token -> embedding</p></figcaption></figure>
+<figure><img src="../../../../.gitbook/assets/Chatbots_King-Diagram.svg" alt=""><figcaption><p>The flow of word -> token -> embedding</p></figcaption></figure>

`Embeddings` aren't limited to words; we have models that can embed entire sentences.

-<figure><img src="../../.gitbook/assets/Chatbots_Tokens-Diagram.svg" alt=""><figcaption><p>The flow of sentence -> tokens -> embedding</p></figcaption></figure>
+<figure><img src="../../../../.gitbook/assets/Chatbots_Tokens-Diagram.svg" alt=""><figcaption><p>The flow of sentence -> tokens -> embedding</p></figcaption></figure>
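The `embed` function above is pseudocode. As one concrete stand-in (an assumption, not necessarily the library the guide uses), the sentence-transformers package embeds both single words and whole sentences:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # an example model choice

embedding_1 = model.encode("King")                       # word -> embedding
sentence_emb = model.encode("What is Baldur's Gate 3?")  # sentence -> embedding
print(embedding_1[:3], len(embedding_1))                 # e.g. [0.11 -0.32 0.46] 384
```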

Why do we care about `embeddings`? `Embeddings` have a very interesting property: words and sentences with close [semantic similarity](https://en.wikipedia.org/wiki/Semantic\_similarity) sit closer to one another in vector space than words and sentences that do not.
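A small sketch of that property, assuming sentence-transformers and cosine similarity as the closeness measure (other metrics, like Euclidean distance, work too):

```python
from numpy import dot
from numpy.linalg import norm
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

def cosine(a, b):
    # Cosine similarity: near 1.0 means same direction, near 0 means unrelated.
    return dot(a, b) / (norm(a) * norm(b))

king, queen, pancake = model.encode(["King", "Queen", "Pancake"])
print(cosine(king, queen))    # relatively high: semantically close
print(cosine(king, pancake))  # lower: semantically distant
```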

@@ -157,7 +157,7 @@ print(context)
There is a lot going on here, so let's check out this diagram and step through it.

-<figure><img src="../../.gitbook/assets/Chatbots_Flow-Diagram.svg" alt=""><figcaption><p>The flow of taking a document, splitting it into chunks, embedding those chunks, and then retrieving a chunk based on a user's query</p></figcaption></figure>
+<figure><img src="../../../../.gitbook/assets/Chatbots_Flow-Diagram.svg" alt=""><figcaption><p>The flow of taking a document, splitting it into chunks, embedding those chunks, and then retrieving a chunk based on a user's query</p></figcaption></figure>

Step 1: We take the document and split it into chunks. Chunks are typically a paragraph or two in size. There are many ways to split documents into chunks; for more information, check out [this guide](https://www.pinecone.io/learn/chunking-strategies/).
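For illustration, a naive fixed-size splitter; the size and overlap values are arbitrary assumptions, and paragraph- or token-aware splitters are usually better:

```python
def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size character chunks with a small overlap."""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # overlap preserves context across boundaries
    return chunks

chunks = chunk_text("your document text here " * 100)
print(len(chunks), len(chunks[0]))
```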
