feat: Add basic markdown rendering for AI responses #21 #34
Conversation
Please review this PR.
it’d be good to add a real example from actual AI output — like pipe a markdown-style response from the LLM and see how react-markdown renders it. also check whether it breaks on malformed markdown or ANSI output (like `**\x1b[31merror\x1b[0m**`).
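One way to handle the ANSI case the reviewer mentions — a minimal sketch, not part of this PR, where `stripAnsi` is a hypothetical helper that removes color escape sequences before the text reaches react-markdown:

```typescript
// Hypothetical helper (not in this PR): strip ANSI color escape
// sequences before handing LLM output to react-markdown, so a chunk
// like "**\x1b[31merror\x1b[0m**" renders as bold "error" instead of
// leaking raw escape codes into the markdown source.
const ANSI_PATTERN = /\x1b\[[0-9;]*m/g;

export function stripAnsi(text: string): string {
  return text.replace(ANSI_PATTERN, "");
}
```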
src/components/Terminal.tsx
Outdated
const [messages] = useState<Message[]>([
  {
    type: 'llm',
    text: '**Welcome!**\n\n_Everything you type will render as Markdown._\n\nTry list:\n- Item 1\n- Item 2\n\nCode:\n```\necho hello\n```'
  }
]);
rn the `messages` array is hardcoded — this just demos markdown rendering, but isn’t actually wired into the real terminal logic or AI response handling. can you hook this into the actual data flow where `llm` messages are pushed during runtime?
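What "pushed during runtime" could look like — a sketch under stated assumptions: `Message` mirrors the shape used in this PR, while `pushLlmMessage` and `llmClient.onResponse` are hypothetical names, not the repo's actual API. The push logic is kept as a pure function so it's testable outside React:

```typescript
// `Message` mirrors the shape in this PR.
type Message = { type: 'user' | 'llm'; text: string };

// Hypothetical helper: append an LLM message immutably, so React
// state updates detect the change (new array reference).
export function pushLlmMessage(messages: Message[], text: string): Message[] {
  return [...messages, { type: 'llm', text }];
}

// In the component (sketch only — `llmClient` is an assumed name):
// const [messages, setMessages] = useState<Message[]>([]);
// llmClient.onResponse((text) =>
//   setMessages((prev) => pushLlmMessage(prev, text)));
```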
sure, I'll do that.
review it again please!
appreciate the effort here - but this PR doesn’t really solve the issue as described.
the problem in #21 was about rendering markdown inside real AI responses in the terminal, not creating a demo component with static markdown.
a few things:
- you’ve added a new `Terminal.tsx` instead of working inside the actual `components/Terminal/index.tsx`, which handles real commands and LLM output
- this new component isn’t used anywhere in the main app flow
- `App.tsx` was changed to use static props instead of integrating with actual response history
i’d suggest updating the real terminal output renderer (`TerminalOutput.tsx`) to parse markdown only when the message type is `'llm'`, and integrate `react-markdown` there.
let me know if you need help figuring out where exactly that happens — happy to point you to the right hook.
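The suggested gate could be sketched like this — `shouldRenderMarkdown` is a hypothetical helper name, not the repo's actual API, and the JSX portion is shown as a comment because the surrounding render code isn't in this thread:

```typescript
// Only messages typed 'llm' go through markdown rendering;
// user input and command output stay as plain text.
type MessageType = 'user' | 'llm';

export function shouldRenderMarkdown(type: MessageType): boolean {
  return type === 'llm';
}

// In TerminalOutput.tsx (sketch):
// return shouldRenderMarkdown(message.type)
//   ? <ReactMarkdown>{message.text}</ReactMarkdown>
//   : <span>{message.text}</span>;
```

Keeping the check in a small helper makes it easy to unit-test without mounting the component.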
this isn’t the actual terminal component - the real one is in `components/Terminal/index.tsx` and handles input, history, command execution, and AI response handling. this new file isn’t used in the app and doesn’t integrate with anything.
return <Terminal />;
return (
  <div className="h-screen w-full bg-gray-950 text-white p-4 space-y-4">
    {/* ✅ Use both to remove warning and test */}
    <Terminal aiResponse={exampleResponse} />
    <Terminal aiResponse={malformedResponse} />
  </div>
);
you’ve swapped the actual terminal for a demo version here - this breaks the real app flow. we shouldn’t be hardcoding markdown props here; markdown rendering needs to work inside the real terminal output from the AI responses.
Changes
- `react-markdown` dependency.
- `Terminal.tsx` for AI responses.
- `package.json` and Tailwind configuration.

Closes
Closes #21