16x Prompt
Streaming
Stream the AI response as it is generated, rather than showing it only once it is complete.
Auto-creation of context
For the first request, the application provides only the file tree (paths and sizes) as context; the AI agent then uses OpenAI function calling / Anthropic tools to request the files it needs for full context (a whole file or just its header, for example).
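Not part of the request above, just a minimal sketch (assuming the OpenAI Python SDK) of how such a flow could work: the first request carries only the file tree, and the model pulls in file contents through a hypothetical read_file tool. The model name and paths are placeholders.

```python
import json
from pathlib import Path

from openai import OpenAI  # assumes the official OpenAI Python SDK (v1.x)

client = OpenAI()

# One tool the model can call to pull file contents in on demand.
# The tool name "read_file" is illustrative, not part of 16x Prompt.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Return the contents of one file from the project.",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string", "description": "Path relative to the project root"}},
            "required": ["path"],
        },
    },
}]

def file_tree(root: str) -> str:
    """Build the initial context: file paths and sizes only, no contents."""
    return "\n".join(f"{p} ({p.stat().st_size} bytes)"
                     for p in Path(root).rglob("*") if p.is_file())

messages = [
    {"role": "system", "content": "Request files with read_file before answering."},
    {"role": "user", "content": "Project file tree:\n" + file_tree(".") + "\n\nTask: explain the build setup."},
]

while True:
    resp = client.chat.completions.create(model="gpt-4o", messages=messages, tools=TOOLS)
    msg = resp.choices[0].message
    if not msg.tool_calls:          # model has enough context and answered
        print(msg.content)
        break
    messages.append(msg)            # keep the assistant's tool-call turn in the history
    for call in msg.tool_calls:     # feed back the requested file contents
        path = json.loads(call.function.arguments)["path"]
        messages.append({"role": "tool", "tool_call_id": call.id,
                         "content": Path(path).read_text(errors="replace")})
```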
Check out shipped features in the release notes
Shipped features are listed in the 16x Prompt release notes: https://prompt.16x.engineer/release
Allow 'Saving' of a 'Response' so that it can be reviewed at a later time.
IN PROGRESS
Maybe this could be incorporated into the 'History' pane with an improved 'History Manager' interface or something :)
Code editing
IN PROGRESS
Ability to edit code directly, with a preview and an approval step (accept / reject). Need to assess whether this is simple and feasible.
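For illustration only: a minimal sketch of the preview-and-approve step using Python's standard difflib. The function name propose_edit and the console prompt are assumptions, not the app's planned design.

```python
import difflib
from pathlib import Path

def propose_edit(path: str, new_text: str) -> bool:
    """Show a unified diff of the proposed change and apply it only if accepted."""
    original = Path(path).read_text()
    diff = difflib.unified_diff(
        original.splitlines(keepends=True),
        new_text.splitlines(keepends=True),
        fromfile=f"{path} (current)",
        tofile=f"{path} (proposed)",
    )
    print("".join(diff))  # the preview shown to the user
    if input("Accept this edit? [y/N] ").strip().lower() == "y":
        Path(path).write_text(new_text)  # write only after explicit approval
        return True
    return False  # rejected: the file is left untouched
```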
Local LLM URL
Add a configurable URL for Ollama so its API can be used locally, instead of copy-pasting prompts into a local LLM.
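As a rough sketch of what a configurable local URL would enable (assuming the requests library and Ollama's default endpoint at http://localhost:11434), with a placeholder model name:

```python
import requests  # third-party HTTP client (pip install requests)

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address; would be user-configurable

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a single chat request to a locally running Ollama server and return the reply."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={"model": model,
              "messages": [{"role": "user", "content": prompt}],
              "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

print(ask_local_model("Summarize: def add(a, b): return a + b"))
```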
Windows install warning
Windows shows a warning when installing the software (the Windows Defender SmartScreen filter). It would be nice to get rid of this, as it gives the impression that the app could be malicious.
Support Homebrew
Package the app so it can be installed via Homebrew.
Follow-up Message Textarea
- A larger follow-up message textarea with more lines, or at least an option to increase them
- A fixed follow-up message textarea, so you can scroll through the response while keeping the textarea visible
- Ability to move the follow-up textarea to the top or bottom