# Finetuning Llama3 using Unsloth

We have two notebooks here:

## Using a simple prompt template

* Filename: `llama3_text2cypher_simple.ipynb`
* Contributed by: [Geraldus Wilsen](https://github.com/projectwilsen/)
* Dataset: synthetic_gpt4turbo_demodbs
* Originally published: https://github.com/projectwilsen/KnowledgeGraphLLM

This notebook uses a simple prompt-completion template to finetune Llama3 to construct Cypher statements against a single database (recommendations).

For more information, you can watch this video tutorial: https://www.youtube.com/watch?v=7VU-xWJ39ng
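For context, a prompt-completion training record for this style of finetuning can be sketched as follows. The template wording and field names below are illustrative assumptions, not the exact ones used in the notebook:

```python
# Hedged sketch of a prompt-completion record for text2cypher finetuning.
# The actual template and field names in the notebook may differ.
def build_example(schema: str, question: str, cypher: str) -> dict:
    prompt = (
        "Convert the question into a Cypher query.\n"
        f"Schema: {schema}\n"
        f"Question: {question}\n"
        "Cypher: "
    )
    # During finetuning, the model learns to emit `cypher` as the
    # completion that follows `prompt`.
    return {"prompt": prompt, "completion": cypher}

example = build_example(
    schema="(:Person)-[:ACTED_IN]->(:Movie)",
    question="Who acted in The Matrix?",
    cypher="MATCH (p:Person)-[:ACTED_IN]->(m:Movie {title: 'The Matrix'}) RETURN p.name",
)
print(example["prompt"] + example["completion"])
```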

## Using a chat prompt template

* Filename: `llama3_text2cypher_chat.ipynb`
* Contributed by: [Tomaz Bratanic](https://github.com/tomasonjo)
* Dataset: synthetic_gpt4o_demodbs
* HuggingFace model: https://huggingface.co/tomasonjo/text2cypher-demo-16bit
* Ollama model: https://ollama.com/tomasonjo/llama3-text2cypher-demo

This notebook uses a chat prompt template (system, user, assistant) to finetune Llama3 to construct Cypher statements against the 16 different graph databases available on the demo server.
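In the chat format, each training record is a list of role-tagged messages. The sketch below shows the general shape; the concrete message wording is an assumption, not copied from the dataset:

```python
# Hedged sketch of a chat-format (system/user/assistant) training record.
# The actual content of each message in the dataset may differ.
record = [
    {
        "role": "system",
        "content": "Given an input question, convert it to a Cypher query. No pre-amble.",
    },
    {
        "role": "user",
        "content": "Schema: (:Person)-[:ACTED_IN]->(:Movie)\n"
                   "Question: How many movies did Tom Hanks play in?",
    },
    {
        "role": "assistant",
        "content": "MATCH (p:Person {name: 'Tom Hanks'})-[:ACTED_IN]->(m:Movie) RETURN count(m)",
    },
]
roles = [m["role"] for m in record]
print(roles)
```

The assistant message holds the target Cypher statement; the system and user messages form the prompt side of the pair.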
You can use the finetuned model in LangChain. First pull the model with Ollama and install the dependencies:

```bash
ollama pull tomasonjo/llama3-text2cypher-demo
pip install langchain langchain-community neo4j
```

Then use the following code to generate Cypher statements with LangChain:
| 35 | + |
```python
from langchain_community.chat_models import ChatOllama
from langchain_community.graphs import Neo4jGraph
from langchain_core.prompts import ChatPromptTemplate

# On the Neo4j demo server, the username and password match the database name.
DEMO_URL = "neo4j+s://demo.neo4jlabs.com"
DATABASE = "recommendations"

graph = Neo4jGraph(
    url=DEMO_URL,
    database=DATABASE,
    username=DATABASE,
    password=DATABASE,
    enhanced_schema=True,
    sanitize=True,
)
llm = ChatOllama(model="tomasonjo/llama3-text2cypher-demo")
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "Given an input question, convert it to a Cypher query. No pre-amble.",
        ),
        (
            "human",
            (
                "Based on the Neo4j graph schema below, write a Cypher query that "
                "would answer the user's question: "
                "\n{schema} \nQuestion: {question} \nCypher query:"
            ),
        ),
    ]
)
chain = prompt | llm

question = "How many movies did Tom Hanks play in?"
response = chain.invoke({"question": question, "schema": graph.schema})
print(response.content)
```