I've been working on my own personal assistant for a couple of months and just connected it to Telegram so I can reach it from anywhere. It can "talk" to my calendar, run commands on my home computer, etc. It also has its own memory, so it doesn't need huge prompt windows (I'm running a couple of fine-tuned Curie models, btw). Now I've been giving it API access to a bunch of stuff to increase its capabilities.
To keep the "memory", do you pass the embeddings along with the new text prompt in an API call? How do you combine embeddings and text prompts? I don't know much about this, sorry if the question sounds silly.
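For what it's worth, the common pattern (and possibly what the parent is doing, though that's an assumption) is that the embeddings themselves never go into the completion call: past exchanges are embedded and stored, the new message is embedded, the most similar stored snippets are looked up, and their text is prepended to the prompt. A rough sketch along those lines, using the older openai<1.0 Python API and plain in-memory lists; the model names and helper functions are illustrative, not the parent's actual code:

import numpy as np
import openai  # older openai<1.0 API; reads OPENAI_API_KEY from the environment

memory_texts = []    # past exchanges as plain text
memory_vectors = []  # their embeddings, kept in parallel

def embed(text):
    # One embedding per string; "text-embedding-ada-002" is just a reasonable default choice
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=text)
    return np.array(resp["data"][0]["embedding"])

def remember(text):
    memory_texts.append(text)
    memory_vectors.append(embed(text))

def recall(query, k=3):
    # Cosine similarity between the query embedding and every stored embedding
    q = embed(query)
    sims = [float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v))) for v in memory_vectors]
    top = sorted(range(len(sims)), key=lambda i: sims[i], reverse=True)[:k]
    return [memory_texts[i] for i in top]

def answer(query):
    # The retrieved *text* is what gets combined with the new prompt, not the vectors
    context = "\n".join(recall(query))
    prompt = f"Relevant notes from earlier conversations:\n{context}\n\nUser: {query}\nAssistant:"
    resp = openai.Completion.create(model="text-curie-001", prompt=prompt, max_tokens=200)
    return resp["choices"][0]["text"]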
The code below takes a list of questions from an Excel file and answers each one based on the directory I pass in. I use this as a first pass when answering Statements of Work for proposals I write. Usually I will have a number of different directories that I pass in to 'talk' to different intelligences and get a couple of different answers for each prompt. One trains on the entire corpus of my past performance, one has a simple document discussing tone and other information, and one trains on only the SOW itself.
import os
import pandas as pd
from llama_index import SimpleDirectoryReader, GPTSimpleVectorIndex

def excelGPT(dir, excel_file, sheet):
    # My OpenAI key
    os.environ['OPENAI_API_KEY'] = 'sk-<your OpenAI key here>'
    # Working directory holding the documents to index ("training" corpus)
    root_folder = ''
    documents = SimpleDirectoryReader(root_folder).load_data()
    index = GPTSimpleVectorIndex(documents)
    # Questions live in the first column of the spreadsheet
    file_name = dir + excel_file
    df = pd.read_excel(file_name, sheet_name=sheet)
    answer_array = []
    df_series = df.iloc[:, 0]
    for i, x in enumerate(df_series):
        print("This is the index ", i)
        print(x)
        # Query the vector index with each question and collect the answer
        response = index.query(x)
        answer_array.append(str(response))
    # zip_to_doc is a separate helper that writes the question/answer pairs out to a document
    zip_to_doc(df_series, answer_array, dir)
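For reference, a call with placeholder paths would look something like the line below; dir here is just where the spreadsheet sits and where zip_to_doc writes its output, while the documents that get indexed come from root_folder inside the function.

excelGPT('proposals/sow_123/', 'questions.xlsx', 'Sheet1')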
Hey, is it alright if you explain this in a bit more detail? I've been playing around with llama-index myself. Do you have multiple indices, or do you run each question through and get multiple responses? Isn't that quite expensive?
Also, how do you deal with the formatting of the various Excel files? Would love to see the source code for this if you are willing to share.
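Not the parent's actual source, but based on the description above the multi-index part presumably looks roughly like the sketch below: one GPTSimpleVectorIndex per directory, with each question run through all of them, which does mean one full query (and one set of LLM calls) per index. The directory names are placeholders:

from llama_index import SimpleDirectoryReader, GPTSimpleVectorIndex

# Hypothetical directories standing in for the three corpora described above
corpus_dirs = ["past_performance/", "tone_notes/", "sow_only/"]

# One independent vector index per directory
indices = [GPTSimpleVectorIndex(SimpleDirectoryReader(d).load_data()) for d in corpus_dirs]

def ask_all(question):
    # The same question is sent to every index, so cost scales with the number of indices
    return [str(index.query(question)) for index in indices]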