VS Code AI Coding Assistant that Outputs Step-by-Step Guidance, Not Direct Solutions
Website
·
Demo Slides
·
Demo Video
Unlike conventional AI coding assistants such as GitHub Copilot, GenHint does not give you code directly. Instead, it generates code templates with "TODO" comments and explains each subproblem for you. It is powered by Llama 3 70B hosted on Groq. Published as a VS Code extension and easy to install, it makes a great companion for your coding workflow.
demo.mp4
🔥 Learning with unparalleled companionship: GenHint goes beyond simple guidance. Our approach provides step-by-step instructions, but we don't stop there. If you want to dive deeper into any part of a process, just select the step, and we'll provide detailed, contextual insights to ensure you fully grasp each concept. It's like having a mentor by your side, ready to explain each move.
⚡ Instant responses with no pressure: Thanks to Groq, our users can access the most advanced models at lightning speed with no load on their personal devices. Whether on a workstation or a thin-and-light laptop, everyone can learn to code intelligently without worries.
Here's how to use our extension:
- Highlight a comment and press `cmd+shift+g` (macOS) / `ctrl+shift+g` (Windows) to structurize (break a problem down into TODOs)
- Highlight a comment and press `cmd+shift+e` / `ctrl+shift+e` to elaborate (break a TODO down into specific subtasks)
- Highlight a code region and press `cmd+shift+r` / `ctrl+shift+r` to review (review finished code)
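For illustration, here is the kind of TODO template the structurize step might produce from a highlighted comment. The exact output depends on the model, and the task and function name below are hypothetical:

```python
# Highlighted comment:
# "Read a CSV file and return the average of a numeric column"

def average_of_column(path, column):
    # TODO 1: Open the file at `path` and read all of its lines.
    # TODO 2: Parse the header row to find the index of `column`.
    # TODO 3: Convert that column's value in each remaining row to a float.
    # TODO 4: Return the sum of those values divided by their count.
    pass
```

You would then fill in each TODO yourself, highlighting any step and pressing the elaborate shortcut when you want it broken down further.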
We don't just teach you the basics; we offer the opportunity to explore each instruction in detail. Whether you're stuck on a specific step or simply curious to learn more, our AI-driven tool lets you uncover deeper explanations exactly when you need them, making learning intuitive and personalized.
For more examples, please refer to our project website.
For more information, please refer to our DevPost page.
- Implemented Structurize Module
- Implemented Elaboration Module
- Implemented Review Module
See the open issues for a full list of proposed features (and known issues).
Distributed under the MIT License. See LICENSE.txt
for more information.
Thanks to Warp and Groq for providing the computation platform for this project! If you want to build your own LLM-based tool, check out Groq and see if they could power your project!
We also thank MHacks for giving us the opportunity to meet like-minded people and collaborate on this wonderful project. If you would like to have this experience too, check out all that MHacks has to offer!