CivArchive

    Here you will find the vector data you need if you would like to play with my latest baby.

    It is called Prompt Quill, and it will help you make nice prompts more easily. It is not just another dull prompt helper you might know: it is the world's first RAG workflow fed with more than 4.9 million prompts that I took from Civitai and other sources with their permission.

    The data is now prepared to deliver negative prompts as well as models that might work well with the generated prompt. There is also a one-click installer for all versions, and all versions now support image generation with the created prompts, so it is really time you start trying it =)

    Find the sources you need here: https://github.com/osi1880vr/prompt_quill

    If you like it, please leave a thumbs up =)

    If you would like to contribute, just raise a PR on GitHub.

    There are new features: it now allows a deep dive into the context (i.e. the prompts found in the vector store), you can run batches of prompt generation, and you can enter your prompt in your native language. It will be translated into English and processed from there; the translated prompt is also shown in the output so you can see what it was translated to.

    If you would like to get in contact, PM me here or find me on Discord: https://discord.gg/Krn9UutdGH

    One last thing: I'm very interested in bringing this thing to life on a server, so if you are interested in sponsoring a long-term hosting solution, please let's talk :)

    Description

    This is >3.2M prompts in the vectors for the llama_index version. The newly added prompts come with neither negative prompts nor model information, so you may find that no negative prompt is returned anymore; this can happen, please don't mind.

    FAQ

    Comments (13)

    woobly
    Mar 24, 2024
    CivitAI

    How can I install the new 3.2M llama-index snapshot? I tried reinstalling from scratch with the one-click, but it seems to have downloaded the ~6gb snapshot instead of the new ~10gb one.

    Also, how can I swap out the gguf model for a custom one? The 13B 5_K_M quant is just slightly too big for my 12gb of vram, making it much slower than it could be with a smaller quant.

    osi1880vr
    Author
    Mar 24, 2024· 1 reaction

    The one-click installer still fetches the smaller snapshot; you can just download the bigger one and load it into your Qdrant. Simply delete the existing collection, create a new one with the same name, and upload the larger snapshot. You can find the Qdrant UI at http://localhost:6333/dashboard.

    If you just reinstalled, you should have the latest version from GH. The model settings are in the file settings/defaults.py, or in settings/settings.dat if you have already saved the settings once. There you will find the model_list, and you can add your model as an additional entry; once selected, that choice is saved, so next time it will load your model.

    If you find any cool smaller model that does a good job, please let me know and I will make it the default. You're right that the 13B one is quite large, but it gave me the best results during my (still limited) tests with different LLMs.
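    The delete-and-recover steps above can be sketched against Qdrant's REST API (DELETE on the collection, then PUT to the snapshot recover endpoint). The collection name and snapshot path below are placeholders, not the actual names from a Prompt Quill install; use whatever your installation created:

```python
import json

def build_snapshot_swap_requests(base_url: str, collection: str, snapshot_uri: str):
    """Return the (method, url, body) REST calls needed to replace a Qdrant
    collection with a snapshot: delete the old collection, then recover the
    snapshot under the same name."""
    return [
        # Step 1: drop the existing collection
        ("DELETE", f"{base_url}/collections/{collection}", None),
        # Step 2: recover the larger snapshot into a same-named collection
        ("PUT",
         f"{base_url}/collections/{collection}/snapshots/recover",
         json.dumps({"location": snapshot_uri})),
    ]

# Placeholder names: substitute your install's collection name and snapshot path.
reqs = build_snapshot_swap_requests(
    "http://localhost:6333",
    "prompts",
    "file:///snapshots/prompt_quill_3_2M.snapshot",
)
```

    Sending these two calls (with curl or any HTTP client) has the same effect as doing the delete and snapshot upload by hand in the dashboard.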
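    Adding a custom GGUF model as an additional entry might look like the following. This is a hypothetical sketch: the entry names, paths, and the exact shape of model_list in settings/defaults.py are assumptions, so check the real file in your install.

```python
# Hypothetical sketch of the model_list in settings/defaults.py — the real
# structure in Prompt Quill may differ, so verify against the file itself.
model_list = {
    # example stand-in for the default 13B Q5_K_M model
    "default-13b": {"path": "models/default-13b.Q5_K_M.gguf"},
}

# Add a smaller quant (e.g. one that fits into 12 GB VRAM) as an extra entry;
# once selected in the UI, the choice is persisted (per the comment above,
# to settings/settings.dat after the first save).
model_list["my-small-model"] = {"path": "models/my-model-7b.Q4_K_M.gguf"}
```

    The existing default stays available; the new entry just becomes selectable alongside it.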

    osi1880vr
    Author
    Mar 24, 2024· 1 reaction

    You know what =) I just updated the installer to fetch the larger file

    razunter
    Jun 16, 2024

    Not all installers' "prepare cache" steps download the newer file.

    osi1880vr
    Author
    Jun 17, 2024

    @razunter Oh good point =) I'll fix that

    StantheBrain
    Mar 25, 2024
    CivitAI

    In the sentence "the laboratory of a mad scientist", the request is for a laboratory. And Fooocus will create the image of a laboratory, probably belonging to a mad scientist.

    If I use Prompt Quill, the prompt "the laboratory of a mad scientist" will become (in short): the mad scientist of a laboratory! And Fooocus will create an image of a probable mad scientist in a laboratory.

    It's not perfect!

    osi1880vr
    Author
    Mar 26, 2024

    Where in the description does it say it is perfect?

    StantheBrain
    Apr 18, 2024

    @osi1880vr That's just it, it's not written in the description that it's not perfect!

    So I added it in the comments osi1880vr! :)

    And to justify the omission: it's not perfect, it can understand things in reverse and generate something other than the original request. I'll give you an example:

    In the sentence: "the laboratory of a mad scientist", the request is for a laboratory. And Fooocus will create the image of a laboratory probably belonging to a mad scientist.

    If I use Prompt Quill, the prompt "the laboratory of a mad scientist" will become: the mad scientist of a laboratory (in short)! And SD will create an image of a probable mad scientist in a laboratory.

    It's not perfect!

    PS: this is constructive criticism, intended to point out that Prompt Quill can understand things backwards and generate something other than the request (lab/scientist).

    osi1880vr
    Author
    Apr 19, 2024

    @StantheBrain I get your point. The results are based on what it finds in the data, and then it makes something new out of it. Understanding your initial intent is a much harder task than searching the data and letting it play with that. If you are good at such tasks, you are welcome to help me get it implemented; my brain is just too old and too small to be able to do that.

    This will not solve the issue as such, but in the model settings you can set the temperature. This gives the LLM more freedom in what it does; maybe that way it gets closer to what you expect. The character tab also gives you some control over how it will react; maybe it helps if you add something like "keep the initial intent of my query intact".

    It's a toy rather than a scientific solution, and it is meant for you to play with. So play around and tell me what you find.

    I still get your point and I will think about it, but the size of my brain predicts this can take a while =)

    DeeTenF
    Mar 26, 2024
    CivitAI

    Tried pressing the like button, but the website is buggy. So, consider this the like lol

    osi1880vr
    Author
    Mar 27, 2024

    I thank you very much for trying =)

    DeeTenF
    Mar 28, 2024

    I have to take back my previous comment. I'm pretty sure I somehow destroyed my browser with custom extensions and broke pretty much everything XD, so I can't blame the Civit website this time. Also, thanks for the prompt response to my Reddit question; it's nice to see creators active on all their linked socials! Excited to try this: after skimming through the GitHub and the comments here, I was under the impression I needed either Docker or Linux, glad to hear that's not the case!

    osi1880vr
    Author
    Mar 28, 2024

    If you are on Windows, you can use the one-click installer, which will do the whole thing for you without the need for Docker.

    Other

    Details

    Downloads
    316
    Platform
    CivitAI
    Platform Status
    Available
    Created
    3/23/2024
    Updated
    5/12/2026
    Deleted
    -

    Files

    promptQuillVector_lli32MV11.zip

    Mirrors

    Available On (1 platform)

    Same model published on other platforms. May have additional downloads or version variants.