Example in the comments. It doesn’t reply to posts directly; basically, you mention it like this: [@ChatGPT@lemmings.world](https://lemmings.world/u/ChatGPT) and it replies.

Edit: Added details on how to use it.
I used this yesterday without realizing it can’t use the comment I’m replying to as context, and I looked real stupid in front of all my new Lemmy friends 😭 lmao
It would be kind of cool for the bot to consider the entire chain of comments up to and including the reply that pings it. Perhaps that feature could be gated behind a command argument, like when commanding bot accounts from GitHub comments for CI/CD. That could guard against unintentional prompt injection, or against diluting the context in longer reply chains. Thoughts @rikudou@lemmings.world ?
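Something like this rough Python sketch is what I mean by gating it behind an argument (the `--thread` flag and the handle matching are made up for illustration, not anything the bot actually supports):

```python
import re

# Hypothetical opt-in: only walk the reply chain when the mention
# carries an explicit flag, e.g. "@ChatGPT@lemmings.world --thread".
MENTION = re.compile(r"@ChatGPT@lemmings\.world\b", re.IGNORECASE)
THREAD_FLAG = re.compile(r"--thread\b")

def wants_thread_context(comment_body: str) -> bool:
    """True only if the comment pings the bot AND opts into full-chain context."""
    return bool(MENTION.search(comment_body)) and bool(THREAD_FLAG.search(comment_body))

print(wants_thread_context("@ChatGPT@lemmings.world --thread what do you think?"))  # True
print(wants_thread_context("@ChatGPT@lemmings.world what do you think?"))           # False
```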
I’d like that as well, but my problem is that it can rack up cost real quick, because you’re billed for every token in the conversation. I once managed to incur a cost of $5 in a single conversation of about 37 messages.
I’m planning for the bot to support custom API keys, meaning every user could provide their own key and pay for their own responses; in that case, using the whole chain is possible. But as long as I’m paying for everyone’s responses on Lemmy, that’s unlikely to happen.
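To put rough numbers on why whole-chain context gets pricey: with per-token billing, every reply resends the whole conversation so far, so input tokens grow roughly quadratically with the number of messages. A back-of-the-envelope sketch (the average message size and the price per 1K tokens below are assumptions, not actual figures):

```python
TOKENS_PER_MESSAGE = 150      # assumed average message length in tokens
PRICE_PER_1K_INPUT = 0.03     # assumed GPT-4-era input price, USD per 1K tokens

def input_cost(n_messages: int) -> float:
    # At turn i the API sees all i earlier messages as input again.
    total_tokens = sum(i * TOKENS_PER_MESSAGE for i in range(1, n_messages + 1))
    return total_tokens * PRICE_PER_1K_INPUT / 1000

print(f"${input_cost(37):.2f}")  # ~$3.16 on input alone; output tokens cost extra
```

With a per-user API key, the same chain could still be sent in full, just billed to whoever asked.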
I didn’t know you paid for it 😯. Thank you 🥹 If I were in a position to contribute, I would. For now I’ll just refrain from using it so I don’t drive up your costs.
Feel free to use it; that’s why I made it! I have measures in place to make sure I don’t spend more than I’m willing to.
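For the curious, a guard like that can be as simple as refusing new completions once a daily cap is hit. A minimal sketch (the cap value and the in-memory bookkeeping are assumptions, not the bot’s actual code):

```python
import time

DAILY_BUDGET_USD = 5.00  # assumed cap, not the real one
_spent = {"day": time.strftime("%Y-%m-%d"), "usd": 0.0}

def record_usage(cost_usd: float) -> None:
    today = time.strftime("%Y-%m-%d")
    if _spent["day"] != today:  # new day: reset the counter
        _spent["day"], _spent["usd"] = today, 0.0
    _spent["usd"] += cost_usd

def budget_left() -> bool:
    """Check before each request; skip the completion once the cap is reached."""
    return _spent["usd"] < DAILY_BUDGET_USD
```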