This project is a Cold Email Generator that uses the Llama 3.1 model to generate responses. Because Llama 3.1 is computationally intensive, GroqCloud is used to speed up inference, leveraging its LPU (Language Processing Unit) technology for faster performance.
- Llama 3.1: A powerful language model with 70 billion parameters for generating responses.
- GroqCloud: Provides faster execution of the Llama model using its LPU technology.
- Groq API Key: required to authenticate with GroqCloud (replace 'your_api_key' with your own key).
- ChatGroq: LangChain's chat model integration for calling the Llama model on GroqCloud.
- ChromaDB: A lightweight vector database used for storing and retrieving data efficiently.
- LangChain: Framework used for creating and managing language model chains.
- WebBaseLoader: For web scraping to gather and process data.
- Pipellm: For forming and managing chains to pass data to the language model.
- Streamlit: For the User Interface.
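ChromaDB's role in the stack above is nearest-neighbour retrieval over embedding vectors (e.g. matching portfolio links to the skills in a scraped job posting). A minimal pure-Python illustration of that lookup using cosine similarity — not ChromaDB's actual API, and all data here is hypothetical:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def query_nearest(store, query_vec):
    """Return the stored document whose embedding is closest to the query."""
    return max(store, key=lambda doc: cosine_similarity(doc["embedding"], query_vec))

# Hypothetical toy "vector store": portfolio links with tiny fake embeddings.
store = [
    {"link": "https://example.com/ml-portfolio",  "embedding": [0.9, 0.1, 0.0]},
    {"link": "https://example.com/web-portfolio", "embedding": [0.1, 0.9, 0.2]},
]

best = query_nearest(store, [0.8, 0.2, 0.1])
print(best["link"])  # the ML portfolio is the closer match for this query
```

ChromaDB performs the same kind of similarity search, but persists the embeddings and handles indexing at scale.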
- Python 3.x
- Pip (Python package installer)
Clone the repository:

```sh
git clone https://github.com/VidhiyaSB/Cold-Mail-Generator-using-Llama-3.1.git
cd Cold-Mail-Generator-using-Llama-3.1
```
Ensure you have access to GroqCloud and set up an API key. Keep this key secure: store it in an environment variable or a configuration file rather than committing it to the repository.
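One common way to supply the key is an environment variable; a sketch for both platforms (GROQ_API_KEY is the variable name the langchain-groq integration reads by default):

```sh
# Linux/macOS: set the key for the current shell session
export GROQ_API_KEY="your_api_key"

# Windows (PowerShell)
$env:GROQ_API_KEY = "your_api_key"
```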
Run the app:

```sh
streamlit run .\app\main.py
```

This script uses the Llama 3.1 model via GroqCloud and the LangChain framework to generate cold emails from the input data.
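At its core, the generation step fills a prompt template with the scraped job description and the matched portfolio links, then sends that prompt to Llama 3.1. A stdlib-only sketch of the templating part (the wording and function name are illustrative, not the project's actual prompt):

```python
def build_cold_email_prompt(job_description: str, portfolio_links: list[str]) -> str:
    """Fill a cold-email prompt template with scraped job data.

    The template text here is a hypothetical example; the real project
    defines its own prompt inside the LangChain pipeline.
    """
    links = "\n".join(f"- {link}" for link in portfolio_links)
    return (
        "You are a business development executive writing a cold email.\n"
        f"Job description:\n{job_description}\n\n"
        f"Relevant portfolio links:\n{links}\n\n"
        "Write a concise, personalised cold email pitching our services."
    )

prompt = build_cold_email_prompt(
    "Looking for an ML engineer with LLM experience.",
    ["https://example.com/ml-portfolio"],
)
print(prompt)
```

The resulting string is what gets passed to the model (via ChatGroq) as the user message.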
Contributions are welcome! Please open an issue or submit a pull request with your changes.
This project is licensed under the MIT License - see the LICENSE file for details.
For any questions or issues, please contact me.